Mar 16 18:45:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:45:11.803 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 16 18:45:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:45:11.804 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 16 18:45:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:45:11.804 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 16 18:46:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:46:11.805 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 16 18:46:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:46:11.806 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 16 18:46:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:46:11.807 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 16 18:47:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:47:11.807 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 16 18:47:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:47:11.808 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 16 18:47:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:47:11.808 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 16 18:48:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:48:11.808 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 16 18:48:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:48:11.809 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 16 18:48:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:48:11.809 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 16 18:48:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:48:11.999 28952 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Mar 16 18:48:12 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:48:11.998 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Mar 16 18:48:13 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:48:13.037 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 16 18:48:13 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:48:13.038 28854 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Mar 16 18:48:13 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:48:13.038 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Mar 16 18:48:13 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:48:13.039 28952 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 16 18:48:13 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:48:13.040 28952 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Mar 16 18:48:13 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:48:13.040 28952 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Mar 16 18:48:15 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:48:15.081 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 16 18:48:15 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:48:15.108 28952 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 16 18:48:17 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:48:17.082 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Mar 16 18:48:17 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:48:17.083 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Mar 16 18:48:17 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:48:17.111 28952 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Mar 16 18:48:17 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:48:17.112 28952 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Mar 16 18:48:21 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:48:21.093 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 16 18:48:21 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:48:21.129 28952 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 16 18:48:25 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:48:25.095 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Mar 16 18:48:25 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:48:25.095 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Mar 16 18:48:25 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:48:25.134 28952 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Mar 16 18:48:25 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:48:25.134 28952 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Mar 16 18:49:06 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:49:06.158 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Mar 16 18:49:06 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:49:06.171 28854 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:0c:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:fa:82:2c:77:91'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Mar 16 18:49:06 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:49:06.172 28854 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Mar 16 18:49:06 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:49:06.173 28854 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2b8cf37e-0419-48d0-ac7c-ece6cde879cb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Mar 16 18:49:06 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:49:06.223 28952 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Mar 16 18:49:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:49:11.809 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 16 18:49:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:49:11.810 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 16 18:49:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:49:11.810 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 16 18:50:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:50:11.811 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 16 18:50:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:50:11.813 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 16 18:50:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:50:11.814 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 16 18:51:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:51:11.813 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 16 18:51:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:51:11.814 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 16 18:51:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:51:11.814 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 16 18:52:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:52:11.815 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 16 18:52:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:52:11.817 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 16 18:52:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:52:11.818 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 16 18:53:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:53:11.815 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 16 18:53:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:53:11.816 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 16 18:53:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:53:11.816 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 16 18:54:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:54:11.817 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 16 18:54:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:54:11.817 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 16 18:54:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:54:11.818 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 16 18:55:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:55:11.818 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 16 18:55:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:55:11.819 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 16 18:55:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:55:11.819 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 16 18:55:38 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:55:38.113 28952 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Mar 16 18:55:38 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:55:38.113 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Mar 16 18:55:39 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:55:39.201 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 16 18:55:39 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:55:39.202 28952 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 16 18:55:40 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:55:40.203 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Mar 16 18:55:40 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:55:40.203 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Mar 16 18:55:40 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:55:40.204 28952 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Mar 16 18:55:40 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:55:40.205 28952 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Mar 16 18:55:42 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:55:42.215 28952 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 16 18:55:42 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:55:42.215 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 16 18:55:44 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:55:44.217 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Mar 16 18:55:44 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:55:44.217 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Mar 16 18:55:44 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:55:44.218 28952 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Mar 16 18:55:44 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:55:44.219 28952 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Mar 16 18:55:48 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:55:48.225 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 16 18:55:48 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:55:48.243 28952 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 16 18:55:48 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:55:48.245 28854 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Mar 16 18:55:48 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:55:48.245 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Mar 16 18:55:48 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:55:48.247 28952 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Mar 16 18:55:48 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:55:48.247 28952 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Mar 16 18:56:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:56:11.821 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 16 18:56:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:56:11.821 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 16 18:56:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:56:11.822 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 16 18:57:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:57:11.822 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 16 18:57:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:57:11.823 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 16 18:57:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:57:11.823 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 16 18:57:24 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:57:24.395 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Mar 16 18:57:24 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:57:24.403 28854 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:0c:42', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '72:fa:82:2c:77:91'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Mar 16 18:57:24 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:57:24.404 28854 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Mar 16 18:57:24 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:57:24.436 28952 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Mar 16 18:57:32 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:57:32.406 28854 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2b8cf37e-0419-48d0-ac7c-ece6cde879cb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Mar 16 18:57:32 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:57:32.407 28854 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Mar 16 18:58:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:58:11.824 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 16 18:58:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:58:11.824 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 16 18:58:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:58:11.824 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 16 18:59:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:59:11.826 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 16 18:59:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:59:11.827 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 16 18:59:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 18:59:11.827 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 16 19:00:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 19:00:11.827 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 16 19:00:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 19:00:11.827 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 16 19:00:11 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 19:00:11.827 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 16 19:00:53 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 19:00:53.050 28952 INFO eventlet.wsgi.server [-] (28952) wsgi exited, is_accepting=True
Mar 16 19:00:53 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 19:00:53.051 28952 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Mar 16 19:00:53 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 19:00:53.051 28952 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Mar 16 19:00:53 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 19:00:53.051 28952 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Mar 16 19:00:53 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 19:00:53.054 28854 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170
Mar 16 19:00:53 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 19:00:53.325 28854 INFO oslo_service.service [-] Caught SIGTERM, stopping children
Mar 16 19:00:53 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 19:00:53.325 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Mar 16 19:00:53 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 19:00:53.326 28854 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Mar 16 19:00:53 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 19:00:53.326 28854 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Mar 16 19:00:53 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 19:00:53.326 28854 DEBUG oslo_service.service [-] Stop services. stop /usr/lib/python3.9/site-packages/oslo_service/service.py:695
Mar 16 19:00:53 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 19:00:53.327 28854 DEBUG oslo_service.service [-] Killing children. stop /usr/lib/python3.9/site-packages/oslo_service/service.py:700
Mar 16 19:00:53 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 19:00:53.327 28854 INFO oslo_service.service [-] Waiting on 1 children to exit
Mar 16 19:00:53 edpm-compute-0 ovn_metadata_agent[28849]: 2026-03-16 19:00:53.327 28854 INFO oslo_service.service [-] Child 28952 exited with status 0
Mar 16 19:00:56 edpm-compute-0 podman[174603]: Error: no container with name or ID "ovn_metadata_agent" found: no such container
Mar 16 19:00:56 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Control process exited, code=exited, status=125/n/a
Mar 16 19:00:56 edpm-compute-0 podman[174621]: Error: no container with name or ID "ovn_metadata_agent" found: no such container
Mar 16 19:00:56 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Control process exited, code=exited, status=125/n/a
Mar 16 19:00:56 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Failed with result 'exit-code'.
Mar 16 19:00:56 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Scheduled restart job, restart counter is at 1.
Mar 16 19:00:56 edpm-compute-0 systemd[1]: Stopped ovn_metadata_agent container.
Mar 16 19:00:56 edpm-compute-0 systemd[1]: Starting ovn_metadata_agent container...
Mar 16 19:00:56 edpm-compute-0 edpm-start-podman-container[174647]: ovn_metadata_agent
Mar 16 19:00:56 edpm-compute-0 edpm-start-podman-container[174644]: Creating additional drop-in dependency for "ovn_metadata_agent" (aafa5008386340431a9571dd775a32de09d3d9220264f55b37ca0bdd6c6cb195)
Mar 16 19:00:57 edpm-compute-0 systemd[1]: Started ovn_metadata_agent container.