Feb 27 09:47:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:47:39.549 28848 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 09:47:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:47:39.549 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 09:47:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:47:39.550 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 09:48:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:48:39.550 28848 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 09:48:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:48:39.550 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 09:48:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:48:39.551 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 09:49:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:49:39.551 28848 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 09:49:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:49:39.552 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 09:49:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:49:39.552 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 09:50:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:50:39.552 28848 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 09:50:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:50:39.553 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 09:50:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:50:39.553 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 09:51:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:51:39.553 28848 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 09:51:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:51:39.553 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 09:51:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:51:39.553 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 09:52:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:52:39.555 28848 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 09:52:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:52:39.555 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 09:52:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:52:39.555 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 09:53:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:53:39.557 28848 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 09:53:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:53:39.557 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 09:53:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:53:39.557 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 09:54:09 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:54:09.906 28942 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 27 09:54:09 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:54:09.907 28848 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 27 09:54:10 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:54:10.945 28848 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 27 09:54:10 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:54:10.946 28942 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 27 09:54:11 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:54:11.946 28848 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 27 09:54:11 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:54:11.946 28848 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 27 09:54:11 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:54:11.947 28942 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 27 09:54:11 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:54:11.948 28942 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 27 09:54:13 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:54:13.994 28848 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 27 09:54:14 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:54:14.006 28942 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 27 09:54:15 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:54:15.995 28848 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 27 09:54:15 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:54:15.996 28848 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 27 09:54:16 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:54:16.008 28942 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 27 09:54:16 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:54:16.008 28942 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 27 09:54:20 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:54:20.026 28942 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 27 09:54:20 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:54:20.026 28848 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 27 09:54:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:54:23.122 28942 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (No route to host)
Feb 27 09:54:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:54:23.122 28942 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 27 09:54:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:54:23.123 28848 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (No route to host)
Feb 27 09:54:23 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:54:23.123 28848 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 27 09:54:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:54:39.558 28848 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 09:54:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:54:39.559 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 09:54:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:54:39.559 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 09:54:50 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:54:50.354 28848 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 27 09:54:50 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:54:50.389 28942 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 27 09:54:50 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:54:50.426 28848 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '56:3c:26', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '2a:a7:6c:88:ae:c4'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 09:54:50 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:54:50.427 28848 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 27 09:54:50 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:54:50.428 28848 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3d514f85-490d-43ab-bdb2-5ee95d2a5803, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 09:55:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:55:39.560 28848 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 09:55:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:55:39.561 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 09:55:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:55:39.561 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 09:56:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:56:39.562 28848 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 09:56:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:56:39.562 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 09:56:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:56:39.563 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 09:57:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:57:39.563 28848 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 09:57:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:57:39.564 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 09:57:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:57:39.564 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 09:58:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:58:39.565 28848 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 09:58:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:58:39.566 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 09:58:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:58:39.566 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 09:59:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:59:39.566 28848 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 09:59:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:59:39.568 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 09:59:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 09:59:39.568 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 10:00:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:00:39.568 28848 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 10:00:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:00:39.568 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 10:00:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:00:39.568 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 10:01:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:01:39.569 28848 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 10:01:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:01:39.570 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 10:01:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:01:39.570 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 10:01:50 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:01:50.724 28848 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 27 10:01:50 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:01:50.725 28942 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 27 10:01:51 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:01:51.798 28848 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 27 10:01:51 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:01:51.809 28942 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 27 10:01:52 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:01:52.800 28848 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 27 10:01:52 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:01:52.800 28848 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 27 10:01:52 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:01:52.812 28942 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 27 10:01:52 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:01:52.812 28942 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 27 10:01:54 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:01:54.807 28848 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 27 10:01:54 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:01:54.820 28942 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 27 10:01:56 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:01:56.809 28848 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 27 10:01:56 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:01:56.809 28848 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 27 10:01:56 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:01:56.823 28942 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 27 10:01:56 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:01:56.824 28942 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 27 10:02:00 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:02:00.823 28848 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 27 10:02:00 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:02:00.825 28848 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 27 10:02:00 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:02:00.825 28848 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 27 10:02:00 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:02:00.832 28942 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 27 10:02:00 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:02:00.833 28942 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 27 10:02:00 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:02:00.833 28942 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 27 10:02:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:02:39.570 28848 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 10:02:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:02:39.570 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 10:02:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:02:39.570 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 10:03:28 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:03:28.932 28848 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 27 10:03:28 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:03:28.940 28848 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '56:3c:26', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '2a:a7:6c:88:ae:c4'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 10:03:28 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:03:28.941 28848 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 27 10:03:28 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:03:28.987 28942 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 27 10:03:37 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:03:37.944 28848 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3d514f85-490d-43ab-bdb2-5ee95d2a5803, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 10:03:37 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:03:37.944 28848 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 10:03:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:03:39.571 28848 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 10:03:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:03:39.571 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 10:03:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:03:39.572 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 10:04:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:04:39.572 28848 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 10:04:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:04:39.573 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 10:04:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:04:39.573 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 10:05:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:05:39.573 28848 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 10:05:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:05:39.574 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 10:05:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:05:39.575 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 10:06:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:06:39.574 28848 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 10:06:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:06:39.574 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 10:06:39 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:06:39.575 28848 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 10:07:20 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:07:20.501 28942 INFO eventlet.wsgi.server [-] (28942) wsgi exited, is_accepting=True
Feb 27 10:07:20 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:07:20.501 28942 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 10:07:20 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:07:20.502 28942 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 10:07:20 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:07:20.502 28848 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170
Feb 27 10:07:20 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:07:20.502 28942 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 10:07:20 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:07:20.683 28848 INFO oslo_service.service [-] Caught SIGTERM, stopping children
Feb 27 10:07:20 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:07:20.685 28848 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 10:07:20 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:07:20.685 28848 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 10:07:20 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:07:20.685 28848 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 10:07:20 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:07:20.685 28848 DEBUG oslo_service.service [-] Stop services. stop /usr/lib/python3.9/site-packages/oslo_service/service.py:695
Feb 27 10:07:20 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:07:20.685 28848 DEBUG oslo_service.service [-] Killing children. stop /usr/lib/python3.9/site-packages/oslo_service/service.py:700
Feb 27 10:07:20 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:07:20.686 28848 INFO oslo_service.service [-] Waiting on 1 children to exit
Feb 27 10:07:20 edpm-compute-0 ovn_metadata_agent[28842]: 2026-02-27 10:07:20.686 28848 INFO oslo_service.service [-] Child 28942 exited with status 0
Feb 27 10:07:24 edpm-compute-0 podman[174783]: Error: no container with ID a12d45740afd2e7ceaa99727850e19f4062ba0de68b590a734703b3c780fd040 found in database: no such container
Feb 27 10:07:24 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Control process exited, code=exited, status=125/n/a
Feb 27 10:07:24 edpm-compute-0 podman[174800]: Error: no container with name or ID "ovn_metadata_agent" found: no such container
Feb 27 10:07:24 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Control process exited, code=exited, status=125/n/a
Feb 27 10:07:24 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Failed with result 'exit-code'.
Feb 27 10:07:24 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Scheduled restart job, restart counter is at 1.
Feb 27 10:07:24 edpm-compute-0 systemd[1]: Stopped ovn_metadata_agent container.
Feb 27 10:07:24 edpm-compute-0 systemd[1]: Starting ovn_metadata_agent container...
Feb 27 10:07:24 edpm-compute-0 edpm-start-podman-container[174828]: ovn_metadata_agent
Feb 27 10:07:24 edpm-compute-0 edpm-start-podman-container[174826]: Creating additional drop-in dependency for "ovn_metadata_agent" (1676f277f5b7577d81908a671cd02f7c8aee023998606bb801bd50bff5e572d5)
Feb 27 10:07:25 edpm-compute-0 systemd[1]: Started ovn_metadata_agent container.