Feb 23 22:33:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:33:04.940 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 22:33:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:33:04.941 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 22:33:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:33:04.942 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 22:34:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:34:04.941 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 22:34:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:34:04.942 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 22:34:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:34:04.942 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 22:35:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:35:04.942 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 22:35:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:35:04.942 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 22:35:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:35:04.942 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 22:36:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:36:04.943 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 22:36:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:36:04.944 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 22:36:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:36:04.944 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 22:37:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:37:04.944 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 22:37:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:37:04.945 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 22:37:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:37:04.945 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 22:38:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:38:04.944 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 22:38:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:38:04.945 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 22:38:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:38:04.945 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 22:38:50 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:38:50.859 28971 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 23 22:38:50 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:38:50.859 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 23 22:38:51 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:38:51.872 28971 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 22:38:51 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:38:51.872 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 22:38:52 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:38:52.874 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 23 22:38:52 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:38:52.874 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 23 22:38:52 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:38:52.875 28971 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 23 22:38:52 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:38:52.875 28971 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 23 22:38:54 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:38:54.881 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 22:38:54 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:38:54.881 28971 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 22:38:56 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:38:56.882 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 23 22:38:56 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:38:56.883 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 23 22:38:56 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:38:56.884 28971 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 23 22:38:56 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:38:56.884 28971 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 23 22:39:00 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:39:00.896 28971 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 22:39:00 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:39:00.897 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 22:39:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:39:04.898 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 23 22:39:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:39:04.898 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 23 22:39:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:39:04.900 28971 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 23 22:39:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:39:04.901 28971 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 23 22:39:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:39:04.945 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 22:39:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:39:04.946 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 22:39:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:39:04.946 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 22:39:20 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:39:20.932 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 23 22:39:20 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:39:20.935 28971 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 23 22:39:20 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:39:20.946 28855 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:25:c1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:48:09:c3:bd:9e'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 22:39:20 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:39:20.947 28855 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 22:39:20 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:39:20.948 28855 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=07e82b07-93fb-4b38-9a67-16ac1bdc24a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 22:40:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:40:04.946 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 22:40:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:40:04.947 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 22:40:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:40:04.947 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 22:41:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:41:04.948 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 22:41:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:41:04.948 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 22:41:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:41:04.948 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 22:42:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:42:04.949 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 22:42:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:42:04.949 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 22:42:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:42:04.950 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 22:43:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:43:04.950 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 22:43:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:43:04.950 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 22:43:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:43:04.950 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 22:44:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:44:04.951 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 22:44:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:44:04.952 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 22:44:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:44:04.952 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 22:45:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:45:04.953 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 22:45:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:45:04.954 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 22:45:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:45:04.954 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 22:45:43 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:45:43.946 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 23 22:45:43 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:45:43.948 28971 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 23 22:45:44 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:45:44.955 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 22:45:45 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:45:45.627 28971 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 22:45:45 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:45:45.957 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 23 22:45:45 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:45:45.957 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 23 22:45:46 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:45:46.629 28971 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 23 22:45:46 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:45:46.630 28971 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 23 22:45:48 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:45:48.019 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 22:45:48 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:45:48.723 28971 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 22:45:50 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:45:50.021 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 23 22:45:50 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:45:50.021 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 23 22:45:50 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:45:50.725 28971 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 23 22:45:50 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:45:50.725 28971 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 23 22:45:54 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:45:54.026 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 22:45:54 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:45:54.028 28855 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 23 22:45:54 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:45:54.028 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 23 22:45:54 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:45:54.742 28971 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 22:45:54 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:45:54.743 28971 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 23 22:45:54 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:45:54.743 28971 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 23 22:46:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:46:04.954 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 22:46:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:46:04.955 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 22:46:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:46:04.955 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 22:47:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:47:04.955 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 22:47:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:47:04.956 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 22:47:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:47:04.962 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 22:47:14 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:47:14.108 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 23 22:47:14 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:47:14.126 28855 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:25:c1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:48:09:c3:bd:9e'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 22:47:14 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:47:14.127 28855 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 22:47:14 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:47:14.130 28855 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=07e82b07-93fb-4b38-9a67-16ac1bdc24a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 22:47:14 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:47:14.130 28855 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 22:47:14 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:47:14.879 28971 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 23 22:48:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:48:04.957 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 22:48:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:48:04.957 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 22:48:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:48:04.958 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 22:49:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:49:04.958 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 22:49:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:49:04.960 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 22:49:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:49:04.960 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 22:50:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:50:04.959 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 22:50:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:50:04.959 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 22:50:04 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:50:04.959 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 22:50:49 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:50:49.372 28971 INFO eventlet.wsgi.server [-] (28971) wsgi exited, is_accepting=True
Feb 23 22:50:49 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:50:49.372 28971 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 22:50:49 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:50:49.372 28971 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 22:50:49 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:50:49.372 28971 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 22:50:49 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:50:49.380 28855 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170
Feb 23 22:50:49 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:50:49.381 28855 INFO oslo_service.service [-] Caught SIGTERM, stopping children
Feb 23 22:50:49 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:50:49.382 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 22:50:49 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:50:49.383 28855 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 22:50:49 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:50:49.383 28855 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 22:50:49 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:50:49.383 28855 DEBUG oslo_service.service [-] Stop services. stop /usr/lib/python3.9/site-packages/oslo_service/service.py:695
Feb 23 22:50:49 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:50:49.384 28855 DEBUG oslo_service.service [-] Killing children. stop /usr/lib/python3.9/site-packages/oslo_service/service.py:700
Feb 23 22:50:49 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:50:49.384 28855 INFO oslo_service.service [-] Waiting on 1 children to exit
Feb 23 22:50:49 edpm-compute-0 ovn_metadata_agent[28850]: 2026-02-23 22:50:49.385 28855 INFO oslo_service.service [-] Child 28971 exited with status 0
Feb 23 22:50:49 edpm-compute-0 conmon[28850]: conmon 0da0265eea30842ad6c0 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-0da0265eea30842ad6c0f03e7f6df037f0f1831ee4f38dd522da462fb2cb01d1.scope/container/memory.events
Feb 23 22:50:51 edpm-compute-0 podman[174117]: 2026-02-23 22:50:51.040922173 +0000 UTC m=+1.566892499 container cleanup 0da0265eea30842ad6c0f03e7f6df037f0f1831ee4f38dd522da462fb2cb01d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 23 22:50:51 edpm-compute-0 podman[174130]: Error: no container with ID 0da0265eea30842ad6c0f03e7f6df037f0f1831ee4f38dd522da462fb2cb01d1 found in database: no such container
Feb 23 22:50:51 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Control process exited, code=exited, status=125/n/a
Feb 23 22:50:51 edpm-compute-0 podman[174146]: Error: no container with name or ID "ovn_metadata_agent" found: no such container
Feb 23 22:50:51 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Control process exited, code=exited, status=125/n/a
Feb 23 22:50:51 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Failed with result 'exit-code'.
Feb 23 22:50:51 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Scheduled restart job, restart counter is at 1.
Feb 23 22:50:51 edpm-compute-0 systemd[1]: Stopped ovn_metadata_agent container.
Feb 23 22:50:51 edpm-compute-0 systemd[1]: Starting ovn_metadata_agent container...
Feb 23 22:50:51 edpm-compute-0 edpm-start-podman-container[174171]: ovn_metadata_agent
Feb 23 22:50:51 edpm-compute-0 edpm-start-podman-container[174170]: Creating additional drop-in dependency for "ovn_metadata_agent" (3fb51257aecbb358179e6d615b70da9404922aaff83d092e3863a2b4b2f9f63f)
Feb 23 22:50:52 edpm-compute-0 systemd[1]: Started ovn_metadata_agent container.