Feb 26 19:39:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:39:18.342 28888 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 19:39:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:39:18.342 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 19:39:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:39:18.342 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 19:40:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:40:18.342 28888 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 19:40:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:40:18.343 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 19:40:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:40:18.343 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 19:41:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:41:18.344 28888 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 19:41:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:41:18.345 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 19:41:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:41:18.345 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 19:42:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:42:18.345 28888 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 19:42:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:42:18.346 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 19:42:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:42:18.346 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 19:43:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:43:18.346 28888 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 19:43:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:43:18.347 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 19:43:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:43:18.347 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 19:44:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:44:18.346 28888 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 19:44:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:44:18.347 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 19:44:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:44:18.347 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 19:45:14 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:45:14.893 28982 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 26 19:45:14 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:45:14.894 28888 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 26 19:45:15 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:45:15.944 28888 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 19:45:15 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:45:15.945 28982 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 19:45:15 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:45:15.945 28888 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 26 19:45:15 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:45:15.945 28888 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 26 19:45:15 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:45:15.946 28982 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 26 19:45:15 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:45:15.946 28982 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 26 19:45:17 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:45:17.954 28888 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 19:45:17 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:45:17.954 28982 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 19:45:17 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:45:17.957 28888 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 26 19:45:17 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:45:17.957 28888 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 26 19:45:17 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:45:17.958 28982 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 26 19:45:17 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:45:17.958 28982 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 26 19:45:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:45:18.347 28888 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 19:45:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:45:18.348 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 19:45:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:45:18.349 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 19:45:22 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:45:22.027 28888 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 19:45:22 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:45:22.029 28982 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 19:45:26 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:45:26.029 28888 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 26 19:45:26 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:45:26.029 28888 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 26 19:45:26 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:45:26.034 28982 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 26 19:45:26 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:45:26.034 28982 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 26 19:45:45 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:45:45.171 28982 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 26 19:45:45 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:45:45.184 28888 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 26 19:45:45 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:45:45.195 28888 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:e8:83', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fe:f1:2d:b2:55:89'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 19:45:45 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:45:45.197 28888 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 26 19:45:45 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:45:45.198 28888 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=567e1821-1671-4620-bfa0-94f2bee81e17, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 19:46:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:46:18.348 28888 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 19:46:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:46:18.348 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 19:46:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:46:18.349 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 19:47:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:47:18.350 28888 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 19:47:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:47:18.350 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 19:47:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:47:18.350 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 19:48:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:48:18.352 28888 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 19:48:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:48:18.352 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 19:48:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:48:18.352 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 19:49:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:49:18.353 28888 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 19:49:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:49:18.353 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 19:49:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:49:18.353 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 19:50:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:50:18.354 28888 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 19:50:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:50:18.355 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 19:50:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:50:18.355 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 19:51:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:51:18.355 28888 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 19:51:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:51:18.356 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 19:51:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:51:18.356 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 19:52:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:52:18.357 28888 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 19:52:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:52:18.357 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 19:52:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:52:18.358 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 19:52:41 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:52:41.598 28982 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 26 19:52:41 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:52:41.599 28888 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 26 19:52:42 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:52:42.614 28982 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 19:52:42 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:52:42.615 28888 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 19:52:42 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:52:42.616 28982 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 26 19:52:42 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:52:42.616 28982 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 26 19:52:42 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:52:42.618 28888 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 26 19:52:42 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:52:42.618 28888 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 26 19:52:44 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:52:44.624 28888 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 19:52:44 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:52:44.624 28982 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 19:52:46 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:52:46.626 28982 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 26 19:52:46 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:52:46.626 28982 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 26 19:52:46 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:52:46.626 28888 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 26 19:52:46 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:52:46.627 28888 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 26 19:52:50 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:52:50.642 28982 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 19:52:50 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:52:50.646 28982 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 26 19:52:50 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:52:50.646 28982 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 26 19:52:50 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:52:50.648 28888 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 19:52:50 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:52:50.649 28888 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 26 19:52:50 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:52:50.649 28888 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 26 19:53:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:53:18.358 28888 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 19:53:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:53:18.359 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 19:53:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:53:18.359 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 19:54:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:54:18.359 28888 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 19:54:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:54:18.359 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 19:54:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:54:18.359 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 19:54:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:54:18.775 28888 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 26 19:54:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:54:18.785 28888 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2a:e8:83', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fe:f1:2d:b2:55:89'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 19:54:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:54:18.787 28888 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 26 19:54:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:54:18.822 28982 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 26 19:54:27 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:54:27.791 28888 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=567e1821-1671-4620-bfa0-94f2bee81e17, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 19:54:27 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:54:27.792 28888 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 26 19:55:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:55:18.359 28888 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 19:55:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:55:18.360 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 19:55:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:55:18.360 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 19:56:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:56:18.361 28888 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 19:56:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:56:18.362 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 19:56:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:56:18.362 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 19:57:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:57:18.361 28888 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 19:57:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:57:18.362 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 19:57:18 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:57:18.362 28888 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 19:58:15 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:58:15.267 28982 INFO eventlet.wsgi.server [-] (28982) wsgi exited, is_accepting=True
Feb 26 19:58:15 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:58:15.268 28982 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 19:58:15 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:58:15.268 28982 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 19:58:15 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:58:15.268 28982 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 19:58:15 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:58:15.271 28888 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170
Feb 26 19:58:15 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:58:15.435 28888 INFO oslo_service.service [-] Caught SIGTERM, stopping children
Feb 26 19:58:15 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:58:15.435 28888 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 19:58:15 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:58:15.435 28888 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 19:58:15 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:58:15.436 28888 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 19:58:15 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:58:15.436 28888 DEBUG oslo_service.service [-] Stop services. stop /usr/lib/python3.9/site-packages/oslo_service/service.py:695
Feb 26 19:58:15 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:58:15.436 28888 DEBUG oslo_service.service [-] Killing children. stop /usr/lib/python3.9/site-packages/oslo_service/service.py:700
Feb 26 19:58:15 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:58:15.436 28888 INFO oslo_service.service [-] Waiting on 1 children to exit
Feb 26 19:58:15 edpm-compute-0 ovn_metadata_agent[28883]: 2026-02-26 19:58:15.436 28888 INFO oslo_service.service [-] Child 28982 exited with status 0
Feb 26 19:58:17 edpm-compute-0 podman[174626]: Error: no container with ID 4a45b28d8c9a233040d6cf5aff6be92dd339d7301d9f7be27c2198106f4a0874 found in database: no such container
Feb 26 19:58:17 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Control process exited, code=exited, status=125/n/a
Feb 26 19:58:17 edpm-compute-0 podman[174642]: Error: no container with name or ID "ovn_metadata_agent" found: no such container
Feb 26 19:58:17 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Control process exited, code=exited, status=125/n/a
Feb 26 19:58:17 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Failed with result 'exit-code'.
Feb 26 19:58:17 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Scheduled restart job, restart counter is at 1.
Feb 26 19:58:17 edpm-compute-0 systemd[1]: Stopped ovn_metadata_agent container.
Feb 26 19:58:17 edpm-compute-0 systemd[1]: Starting ovn_metadata_agent container...
Feb 26 19:58:17 edpm-compute-0 edpm-start-podman-container[174671]: ovn_metadata_agent
Feb 26 19:58:17 edpm-compute-0 edpm-start-podman-container[174669]: Creating additional drop-in dependency for "ovn_metadata_agent" (8c8fff92f65fe11d42fb3b03a59411c6289bc5a0f8a772713ef55ff2aee133f9)
Feb 26 19:58:18 edpm-compute-0 systemd[1]: Started ovn_metadata_agent container.