Mar 12 13:24:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:24:31.050 28853 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 12 13:24:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:24:31.059 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.009s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 12 13:24:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:24:31.061 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 12 13:25:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:25:31.051 28853 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 12 13:25:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:25:31.052 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 12 13:25:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:25:31.052 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 12 13:26:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:26:31.053 28853 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 12 13:26:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:26:31.054 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 12 13:26:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:26:31.054 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 12 13:27:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:27:31.055 28853 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 12 13:27:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:27:31.055 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 12 13:27:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:27:31.055 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 12 13:27:52 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:27:52.267 28853 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Mar 12 13:27:52 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:27:52.269 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Mar 12 13:27:53 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:27:53.315 28853 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 12 13:27:53 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:27:53.315 28853 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Mar 12 13:27:53 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:27:53.315 28853 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Mar 12 13:27:53 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:27:53.320 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 12 13:27:53 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:27:53.322 28950 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Mar 12 13:27:53 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:27:53.323 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Mar 12 13:27:55 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:27:55.321 28853 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 12 13:27:55 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:27:55.323 28853 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Mar 12 13:27:55 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:27:55.323 28853 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Mar 12 13:27:55 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:27:55.340 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 12 13:27:55 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:27:55.340 28950 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Mar 12 13:27:55 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:27:55.341 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Mar 12 13:27:59 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:27:59.332 28853 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 12 13:27:59 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:27:59.333 28853 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Mar 12 13:27:59 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:27:59.334 28853 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Mar 12 13:27:59 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:27:59.353 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 12 13:27:59 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:27:59.354 28950 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Mar 12 13:27:59 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:27:59.355 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Mar 12 13:28:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:28:31.056 28853 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 12 13:28:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:28:31.057 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 12 13:28:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:28:31.057 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 12 13:28:47 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:28:47.454 28853 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Mar 12 13:28:47 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:28:47.464 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Mar 12 13:28:47 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:28:47.466 28853 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ee:ed:fa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'aa:ff:dc:ea:fe:22'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Mar 12 13:28:47 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:28:47.467 28853 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Mar 12 13:28:47 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:28:47.468 28853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=42598e7f-f78d-4d2e-99c5-0f9dbcebf13c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Mar 12 13:29:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:29:31.057 28853 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 12 13:29:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:29:31.058 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 12 13:29:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:29:31.058 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 12 13:30:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:30:31.059 28853 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 12 13:30:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:30:31.060 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 12 13:30:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:30:31.061 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 12 13:31:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:31:31.060 28853 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 12 13:31:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:31:31.061 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 12 13:31:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:31:31.061 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 12 13:32:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:32:31.061 28853 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 12 13:32:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:32:31.062 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 12 13:32:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:32:31.062 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 12 13:33:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:33:31.061 28853 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 12 13:33:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:33:31.062 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 12 13:33:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:33:31.062 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 12 13:34:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:34:31.062 28853 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 12 13:34:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:34:31.063 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 12 13:34:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:34:31.063 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 12 13:35:19 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:35:19.198 28853 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Mar 12 13:35:19 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:35:19.200 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Mar 12 13:35:20 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:35:20.225 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 12 13:35:20 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:35:20.225 28853 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 12 13:35:21 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:35:21.227 28853 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Mar 12 13:35:21 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:35:21.239 28853 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Mar 12 13:35:21 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:35:21.254 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Mar 12 13:35:21 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:35:21.268 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Mar 12 13:35:23 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:35:23.236 28853 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 12 13:35:23 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:35:23.266 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 12 13:35:25 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:35:25.238 28853 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Mar 12 13:35:25 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:35:25.239 28853 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Mar 12 13:35:25 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:35:25.269 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Mar 12 13:35:25 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:35:25.270 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Mar 12 13:35:29 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:35:29.246 28853 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 12 13:35:29 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:35:29.248 28853 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Mar 12 13:35:29 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:35:29.249 28853 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Mar 12 13:35:29 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:35:29.284 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 12 13:35:29 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:35:29.284 28950 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Mar 12 13:35:29 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:35:29.285 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Mar 12 13:35:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:35:31.063 28853 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 12 13:35:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:35:31.063 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 12 13:35:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:35:31.063 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 12 13:36:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:36:31.064 28853 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 12 13:36:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:36:31.065 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 12 13:36:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:36:31.065 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 12 13:36:49 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:36:49.383 28853 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Mar 12 13:36:49 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:36:49.422 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Mar 12 13:36:49 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:36:49.432 28853 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ee:ed:fa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'aa:ff:dc:ea:fe:22'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Mar 12 13:36:49 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:36:49.435 28853 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Mar 12 13:36:51 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:36:51.438 28853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=42598e7f-f78d-4d2e-99c5-0f9dbcebf13c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Mar 12 13:36:51 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:36:51.439 28853 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Mar 12 13:37:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:37:31.066 28853 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 12 13:37:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:37:31.066 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 12 13:37:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:37:31.066 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 12 13:38:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:38:31.067 28853 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 12 13:38:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:38:31.068 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 12 13:38:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:38:31.069 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 12 13:39:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:39:31.068 28853 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 12 13:39:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:39:31.069 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 12 13:39:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:39:31.069 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 12 13:40:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:40:31.070 28853 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 12 13:40:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:40:31.070 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 12 13:40:31 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:40:31.071 28853 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 12 13:40:36 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:40:36.953 28950 INFO eventlet.wsgi.server [-] (28950) wsgi exited, is_accepting=True
Mar 12 13:40:36 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:40:36.953 28950 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Mar 12 13:40:36 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:40:36.954 28950 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Mar 12 13:40:36 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:40:36.954 28950 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Mar 12 13:40:36 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:40:36.956 28853 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170
Mar 12 13:40:37 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:40:37.433 28853 INFO oslo_service.service [-] Caught SIGTERM, stopping children
Mar 12 13:40:37 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:40:37.433 28853 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Mar 12 13:40:37 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:40:37.433 28853 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Mar 12 13:40:37 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:40:37.434 28853 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Mar 12 13:40:37 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:40:37.434 28853 DEBUG oslo_service.service [-] Stop services. stop /usr/lib/python3.9/site-packages/oslo_service/service.py:695
Mar 12 13:40:37 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:40:37.435 28853 DEBUG oslo_service.service [-] Killing children. stop /usr/lib/python3.9/site-packages/oslo_service/service.py:700
Mar 12 13:40:37 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:40:37.435 28853 INFO oslo_service.service [-] Waiting on 1 children to exit
Mar 12 13:40:37 edpm-compute-0 ovn_metadata_agent[28847]: 2026-03-12 13:40:37.435 28853 INFO oslo_service.service [-] Child 28950 exited with status 0
Mar 12 13:40:37 edpm-compute-0 conmon[28847]: conmon a52de2bfc6b9d8a4418c : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a52de2bfc6b9d8a4418cf88a3621be26a10061f75bb4aa27c84b249218e15b24.scope/container/memory.events
Mar 12 13:40:38 edpm-compute-0 podman[174656]: Error: no container with ID a52de2bfc6b9d8a4418cf88a3621be26a10061f75bb4aa27c84b249218e15b24 found in database: no such container
Mar 12 13:40:38 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Control process exited, code=exited, status=125/n/a
Mar 12 13:40:38 edpm-compute-0 podman[174671]: Error: no container with name or ID "ovn_metadata_agent" found: no such container
Mar 12 13:40:38 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Control process exited, code=exited, status=125/n/a
Mar 12 13:40:38 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Failed with result 'exit-code'.
Mar 12 13:40:38 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Scheduled restart job, restart counter is at 1.
Mar 12 13:40:38 edpm-compute-0 systemd[1]: Stopped ovn_metadata_agent container.
Mar 12 13:40:38 edpm-compute-0 systemd[1]: Starting ovn_metadata_agent container...
Mar 12 13:40:38 edpm-compute-0 edpm-start-podman-container[174701]: ovn_metadata_agent
Mar 12 13:40:38 edpm-compute-0 edpm-start-podman-container[174699]: Creating additional drop-in dependency for "ovn_metadata_agent" (5730bb887d13e49a4714f062167cdc40b351f7d12495eadb3276e6fd07738508)
Mar 12 13:40:39 edpm-compute-0 systemd[1]: Started ovn_metadata_agent container.