Feb 25 17:09:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:09:25.048 28842 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 25 17:09:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:09:25.049 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 25 17:09:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:09:25.049 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 25 17:10:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:10:25.048 28842 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 25 17:10:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:10:25.049 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 25 17:10:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:10:25.049 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 25 17:11:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:11:25.050 28842 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 25 17:11:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:11:25.050 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 25 17:11:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:11:25.050 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 25 17:12:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:12:25.050 28842 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 25 17:12:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:12:25.051 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 25 17:12:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:12:25.051 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 25 17:13:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:13:25.053 28842 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 25 17:13:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:13:25.054 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 25 17:13:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:13:25.054 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 25 17:14:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:14:25.055 28842 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 25 17:14:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:14:25.056 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 25 17:14:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:14:25.056 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 25 17:15:15 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:15:15.048 28842 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 25 17:15:15 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:15:15.056 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 25 17:15:17 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:15:17.956 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 25 17:15:17 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:15:17.961 28842 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 25 17:15:18 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:15:18.959 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 25 17:15:18 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:15:18.959 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 25 17:15:18 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:15:18.962 28842 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 25 17:15:18 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:15:18.963 28842 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 25 17:15:20 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:15:20.971 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 25 17:15:20 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:15:20.972 28842 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 25 17:15:22 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:15:22.973 28842 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 25 17:15:22 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:15:22.974 28842 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 25 17:15:22 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:15:22.974 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 25 17:15:22 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:15:22.974 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 25 17:15:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:15:25.056 28842 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 25 17:15:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:15:25.056 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 25 17:15:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:15:25.056 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 25 17:15:26 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:15:26.981 28842 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 25 17:15:26 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:15:26.986 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 25 17:15:30 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:15:30.983 28842 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 25 17:15:30 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:15:30.983 28842 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 25 17:15:30 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:15:30.991 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 25 17:15:30 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:15:30.991 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 25 17:16:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:16:25.058 28842 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 25 17:16:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:16:25.058 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 25 17:16:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:16:25.059 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 25 17:16:34 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:16:34.231 28842 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 25 17:16:34 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:16:34.235 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 25 17:16:34 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:16:34.239 28842 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:64:9d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '96:de:71:0f:27:9c'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 25 17:16:34 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:16:34.240 28842 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 25 17:16:34 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:16:34.240 28842 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=064196f6-e0f5-45fe-95a0-228fe4fe8825, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 25 17:17:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:17:25.060 28842 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 25 17:17:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:17:25.061 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 25 17:17:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:17:25.061 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 25 17:18:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:18:25.062 28842 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 25 17:18:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:18:25.062 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 25 17:18:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:18:25.063 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 25 17:19:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:19:25.062 28842 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 25 17:19:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:19:25.063 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 25 17:19:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:19:25.063 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 25 17:20:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:20:25.064 28842 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 25 17:20:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:20:25.064 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 25 17:20:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:20:25.064 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 25 17:21:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:21:25.065 28842 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 25 17:21:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:21:25.065 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 25 17:21:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:21:25.065 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 25 17:22:06 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:22:06.593 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 25 17:22:06 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:22:06.593 28842 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 25 17:22:07 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:22:07.608 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 25 17:22:07 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:22:07.609 28842 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 25 17:22:08 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:22:08.609 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 25 17:22:08 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:22:08.610 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 25 17:22:08 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:22:08.610 28842 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 25 17:22:08 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:22:08.611 28842 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 25 17:22:10 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:22:10.646 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 25 17:22:10 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:22:10.647 28842 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 25 17:22:12 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:22:12.648 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 25 17:22:12 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:22:12.648 28842 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 25 17:22:12 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:22:12.649 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 25 17:22:12 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:22:12.649 28842 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 25 17:22:16 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:22:16.678 28842 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 25 17:22:16 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:22:16.679 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 25 17:22:16 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:22:16.681 28842 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 25 17:22:16 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:22:16.681 28842 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 25 17:22:16 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:22:16.681 28959 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 25 17:22:16 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:22:16.681 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 25 17:22:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:22:25.066 28842 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 25 17:22:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:22:25.066 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 25 17:22:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:22:25.067 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 25 17:23:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:23:25.067 28842 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 25 17:23:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:23:25.067 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 25 17:23:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:23:25.067 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 25 17:23:44 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:23:44.785 28842 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 25 17:23:44 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:23:44.809 28842 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:64:9d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '96:de:71:0f:27:9c'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 25 17:23:44 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:23:44.812 28842 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 25 17:23:44 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:23:44.828 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 25 17:23:50 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:23:50.815 28842 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=064196f6-e0f5-45fe-95a0-228fe4fe8825, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 25 17:23:50 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:23:50.816 28842 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 25 17:24:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:24:25.068 28842 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 25 17:24:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:24:25.069 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 25 17:24:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:24:25.069 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 25 17:25:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:25:25.069 28842 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 25 17:25:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:25:25.070 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 25 17:25:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:25:25.070 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 25 17:26:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:26:25.070 28842 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 25 17:26:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:26:25.070 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 25 17:26:25 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:26:25.071 28842 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 25 17:27:22 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:27:22.466 28959 INFO eventlet.wsgi.server [-] (28959) wsgi exited, is_accepting=True
Feb 25 17:27:22 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:27:22.467 28959 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 25 17:27:22 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:27:22.468 28959 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 25 17:27:22 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:27:22.468 28959 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 25 17:27:22 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:27:22.469 28842 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170
Feb 25 17:27:23 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:27:23.064 28842 INFO oslo_service.service [-] Caught SIGTERM, stopping children
Feb 25 17:27:23 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:27:23.065 28842 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 25 17:27:23 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:27:23.065 28842 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 25 17:27:23 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:27:23.065 28842 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 25 17:27:23 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:27:23.065 28842 DEBUG oslo_service.service [-] Stop services. stop /usr/lib/python3.9/site-packages/oslo_service/service.py:695
Feb 25 17:27:23 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:27:23.065 28842 DEBUG oslo_service.service [-] Killing children. stop /usr/lib/python3.9/site-packages/oslo_service/service.py:700
Feb 25 17:27:23 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:27:23.066 28842 INFO oslo_service.service [-] Waiting on 1 children to exit
Feb 25 17:27:23 edpm-compute-0 ovn_metadata_agent[28837]: 2026-02-25 17:27:23.066 28842 INFO oslo_service.service [-] Child 28959 exited with status 0
Feb 25 17:27:23 edpm-compute-0 conmon[28837]: conmon b0b6b17c0a8ccdb55c97 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-b0b6b17c0a8ccdb55c977783bad7113785fe1ad65d62f94e440422f0ddeaf12f.scope/container/memory.events
Feb 25 17:27:26 edpm-compute-0 podman[174336]: Error: no container with ID b0b6b17c0a8ccdb55c977783bad7113785fe1ad65d62f94e440422f0ddeaf12f found in database: no such container
Feb 25 17:27:26 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Control process exited, code=exited, status=125/n/a
Feb 25 17:27:26 edpm-compute-0 podman[174355]: Error: no container with name or ID "ovn_metadata_agent" found: no such container
Feb 25 17:27:26 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Control process exited, code=exited, status=125/n/a
Feb 25 17:27:26 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Failed with result 'exit-code'.
Feb 25 17:27:26 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Scheduled restart job, restart counter is at 1.
Feb 25 17:27:26 edpm-compute-0 systemd[1]: Stopped ovn_metadata_agent container.
Feb 25 17:27:26 edpm-compute-0 systemd[1]: Starting ovn_metadata_agent container...
Feb 25 17:27:26 edpm-compute-0 edpm-start-podman-container[174385]: ovn_metadata_agent
Feb 25 17:27:26 edpm-compute-0 edpm-start-podman-container[174383]: Creating additional drop-in dependency for "ovn_metadata_agent" (816a4202b1e6ac6481b13dc5bf62b2c617d6beffee7bd4d0d1c171819b654bfd)
Feb 25 17:27:27 edpm-compute-0 systemd[1]: Started ovn_metadata_agent container.