Feb 26 01:43:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:43:23.730 28847 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 01:43:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:43:23.730 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 01:43:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:43:23.730 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 01:44:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:44:23.731 28847 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 01:44:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:44:23.731 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 01:44:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:44:23.731 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 01:45:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:45:23.731 28847 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 01:45:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:45:23.732 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 01:45:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:45:23.732 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 01:46:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:46:23.732 28847 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 01:46:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:46:23.733 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 01:46:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:46:23.733 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 01:47:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:47:23.734 28847 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 01:47:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:47:23.735 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 01:47:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:47:23.735 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 01:48:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:48:23.736 28847 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 01:48:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:48:23.736 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 01:48:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:48:23.737 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 01:49:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:49:23.738 28847 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 01:49:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:49:23.739 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 01:49:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:49:23.739 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 01:50:09 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:50:09.592 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 26 01:50:09 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:50:09.593 28847 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 26 01:50:10 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:50:10.626 28847 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 01:50:10 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:50:10.631 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 01:50:11 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:50:11.628 28847 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 26 01:50:11 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:50:11.628 28847 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 26 01:50:11 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:50:11.633 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 26 01:50:11 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:50:11.633 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 26 01:50:13 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:50:13.644 28847 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 01:50:13 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:50:13.644 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 01:50:15 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:50:15.645 28847 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 26 01:50:15 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:50:15.645 28847 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 26 01:50:15 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:50:15.647 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 26 01:50:15 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:50:15.647 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 26 01:50:19 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:50:19.652 28847 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 01:50:19 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:50:19.659 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 01:50:19 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:50:19.662 28941 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 26 01:50:19 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:50:19.662 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 26 01:50:20 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:50:20.662 28847 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 26 01:50:20 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:50:20.663 28847 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 26 01:50:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:50:23.739 28847 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 01:50:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:50:23.739 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 01:50:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:50:23.740 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 01:50:27 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:50:27.678 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 26 01:50:28 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:50:28.694 28847 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 26 01:50:28 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:50:28.707 28847 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:74:17', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'd6:4b:ae:66:fe:1b'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 01:50:28 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:50:28.707 28847 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 26 01:50:28 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:50:28.708 28847 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a0fd9056-9920-421c-a580-3c9a9c6647c0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 01:51:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:51:23.740 28847 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 01:51:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:51:23.740 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 01:51:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:51:23.741 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 01:52:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:52:23.741 28847 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 01:52:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:52:23.742 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 01:52:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:52:23.742 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 01:53:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:53:23.742 28847 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 01:53:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:53:23.743 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 01:53:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:53:23.743 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 01:54:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:54:23.744 28847 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 01:54:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:54:23.745 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 01:54:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:54:23.745 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 01:55:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:55:23.745 28847 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 01:55:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:55:23.745 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 01:55:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:55:23.745 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 01:56:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:56:23.746 28847 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 01:56:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:56:23.746 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 01:56:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:56:23.747 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 01:56:41 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:56:41.163 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 26 01:56:41 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:56:41.165 28847 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 26 01:56:42 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:56:42.222 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 01:56:42 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:56:42.223 28847 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 01:56:43 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:56:43.223 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 26 01:56:43 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:56:43.224 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 26 01:56:43 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:56:43.225 28847 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 26 01:56:43 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:56:43.226 28847 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 26 01:56:45 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:56:45.248 28847 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 01:56:45 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:56:45.250 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 01:56:47 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:56:47.250 28847 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 26 01:56:47 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:56:47.250 28847 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 26 01:56:47 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:56:47.253 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 26 01:56:47 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:56:47.253 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 26 01:56:51 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:56:51.340 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 01:56:51 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:56:51.343 28847 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 01:56:51 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:56:51.343 28941 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 26 01:56:51 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:56:51.344 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 26 01:56:51 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:56:51.346 28847 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 26 01:56:51 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:56:51.346 28847 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 26 01:57:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:57:23.747 28847 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 01:57:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:57:23.748 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 01:57:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:57:23.748 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 01:58:11 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:58:11.421 28847 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 26 01:58:11 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:58:11.428 28847 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:74:17', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'd6:4b:ae:66:fe:1b'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 01:58:11 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:58:11.429 28847 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 26 01:58:11 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:58:11.431 28847 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a0fd9056-9920-421c-a580-3c9a9c6647c0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 01:58:11 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:58:11.431 28847 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 26 01:58:11 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:58:11.453 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 26 01:58:11 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:58:11.577 28847 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:74:17', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'd6:4b:ae:66:fe:1b'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 01:58:11 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:58:11.578 28847 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 26 01:58:11 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:58:11.618 28847 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:74:17', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'd6:4b:ae:66:fe:1b'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 01:58:11 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:58:11.619 28847 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 26 01:58:12 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:58:12.622 28847 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a0fd9056-9920-421c-a580-3c9a9c6647c0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 01:58:12 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:58:12.623 28847 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 26 01:58:18 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:58:18.582 28847 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a0fd9056-9920-421c-a580-3c9a9c6647c0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 01:58:18 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:58:18.583 28847 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 26 01:58:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:58:23.748 28847 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 01:58:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:58:23.749 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 01:58:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:58:23.749 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 01:59:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:59:23.749 28847 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 01:59:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:59:23.749 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 01:59:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 01:59:23.750 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 02:00:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 02:00:23.750 28847 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 02:00:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 02:00:23.751 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 02:00:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 02:00:23.751 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 02:01:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 02:01:23.750 28847 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 02:01:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 02:01:23.751 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 02:01:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 02:01:23.751 28847 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 02:02:18 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 02:02:18.322 28941 INFO eventlet.wsgi.server [-] (28941) wsgi exited, is_accepting=True
Feb 26 02:02:18 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 02:02:18.325 28941 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 02:02:18 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 02:02:18.325 28941 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 02:02:18 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 02:02:18.325 28941 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 02:02:18 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 02:02:18.330 28847 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170
Feb 26 02:02:18 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 02:02:18.751 28847 INFO oslo_service.service [-] Caught SIGTERM, stopping children
Feb 26 02:02:18 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 02:02:18.751 28847 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 02:02:18 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 02:02:18.752 28847 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 02:02:18 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 02:02:18.752 28847 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 02:02:18 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 02:02:18.752 28847 DEBUG oslo_service.service [-] Stop services. stop /usr/lib/python3.9/site-packages/oslo_service/service.py:695
Feb 26 02:02:18 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 02:02:18.752 28847 DEBUG oslo_service.service [-] Killing children.
stop /usr/lib/python3.9/site-packages/oslo_service/service.py:700 Feb 26 02:02:18 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 02:02:18.752 28847 INFO oslo_service.service [-] Waiting on 1 children to exit Feb 26 02:02:18 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-26 02:02:18.752 28847 INFO oslo_service.service [-] Child 28941 exited with status 0 Feb 26 02:02:21 edpm-compute-0 podman[174436]: Error: no container with ID dcaf4efb52c1f2a5f8b1314bca52a5ff07e5fc0f455c6876bbaa0bafa4de2590 found in database: no such container Feb 26 02:02:21 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Control process exited, code=exited, status=125/n/a Feb 26 02:02:21 edpm-compute-0 podman[174451]: Error: no container with name or ID "ovn_metadata_agent" found: no such container Feb 26 02:02:21 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Control process exited, code=exited, status=125/n/a Feb 26 02:02:21 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Failed with result 'exit-code'. Feb 26 02:02:21 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Scheduled restart job, restart counter is at 1. Feb 26 02:02:21 edpm-compute-0 systemd[1]: Stopped ovn_metadata_agent container. Feb 26 02:02:21 edpm-compute-0 systemd[1]: Starting ovn_metadata_agent container... Feb 26 02:02:21 edpm-compute-0 edpm-start-podman-container[174482]: ovn_metadata_agent Feb 26 02:02:21 edpm-compute-0 edpm-start-podman-container[174481]: Creating additional drop-in dependency for "ovn_metadata_agent" (92d12722b31c9cf6244b9692fe4a57b6fa85996e121f94cfc5f853cee1091187) Feb 26 02:02:22 edpm-compute-0 systemd[1]: Started ovn_metadata_agent container.