Feb 24 04:10:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:10:45.652 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 24 04:10:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:10:45.652 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 24 04:10:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:10:45.652 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 24 04:11:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:11:45.653 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 24 04:11:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:11:45.655 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 24 04:11:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:11:45.655 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 24 04:12:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:12:45.655 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 24 04:12:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:12:45.655 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 24 04:12:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:12:45.655 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 24 04:13:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:13:45.655 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 24 04:13:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:13:45.656 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 24 04:13:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:13:45.656 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 24 04:14:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:14:45.656 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 24 04:14:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:14:45.657 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 24 04:14:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:14:45.657 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 24 04:15:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:15:45.657 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 24 04:15:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:15:45.657 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 24 04:15:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:15:45.658 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 24 04:16:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:16:45.658 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 24 04:16:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:16:45.658 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 24 04:16:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:16:45.659 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 24 04:17:38 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:17:38.330 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 24 04:17:38 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:17:38.332 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 24 04:17:39 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:17:39.372 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 24 04:17:39 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:17:39.375 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 24 04:17:40 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:17:40.374 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 24 04:17:40 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:17:40.375 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 24 04:17:40 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:17:40.376 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 24 04:17:40 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:17:40.376 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 24 04:17:42 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:17:42.382 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 24 04:17:42 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:17:42.384 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 24 04:17:44 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:17:44.385 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 24 04:17:44 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:17:44.385 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 24 04:17:44 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:17:44.386 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 24 04:17:44 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:17:44.386 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 24 04:17:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:17:45.660 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 24 04:17:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:17:45.660 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 24 04:17:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:17:45.660 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 24 04:17:48 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:17:48.394 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 24 04:17:48 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:17:48.395 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 24 04:17:52 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:17:52.397 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 24 04:17:52 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:17:52.397 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 24 04:17:52 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:17:52.398 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 24 04:17:52 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:17:52.398 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 24 04:18:15 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:18:15.582 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 24 04:18:15 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:18:15.600 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 24 04:18:15 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:18:15.611 28854 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '8a:33:26', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '5a:01:fe:5a:30:8a'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 24 04:18:15 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:18:15.613 28854 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 24 04:18:15 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:18:15.614 28854 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90e1fd23-41d7-4d5a-8723-602357aa34aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 24 04:18:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:18:45.661 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 24 04:18:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:18:45.663 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 24 04:18:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:18:45.663 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 24 04:19:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:19:45.663 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 24 04:19:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:19:45.664 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 24 04:19:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:19:45.665 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 24 04:20:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:20:45.665 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 24 04:20:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:20:45.665 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 24 04:20:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:20:45.665 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 24 04:21:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:21:45.666 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 24 04:21:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:21:45.667 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 24 04:21:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:21:45.667 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 24 04:22:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:22:45.667 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 24 04:22:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:22:45.668 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 24 04:22:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:22:45.668 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 24 04:23:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:23:45.671 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 24 04:23:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:23:45.671 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 24 04:23:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:23:45.672 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 24 04:24:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:24:45.672 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 24 04:24:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:24:45.673 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 24 04:24:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:24:45.673 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 24 04:25:12 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:25:12.141 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 24 04:25:12 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:25:12.142 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 24 04:25:13 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:25:13.177 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 24 04:25:13 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:25:13.177 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 24 04:25:14 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:25:14.178 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 24 04:25:14 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:25:14.178 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 24 04:25:14 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:25:14.178 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 24 04:25:14 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:25:14.179 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 24 04:25:16 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:25:16.208 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 24 04:25:16 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:25:16.209 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 24 04:25:18 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:25:18.211 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 24 04:25:18 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:25:18.211 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 24 04:25:18 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:25:18.211 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 24 04:25:18 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:25:18.211 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 24 04:25:22 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:25:22.334 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 24 04:25:22 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:25:22.337 28950 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 24 04:25:22 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:25:22.337 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 24 04:25:22 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:25:22.359 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 24 04:25:22 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:25:22.360 28854 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 24 04:25:22 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:25:22.360 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 24 04:25:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:25:45.674 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 24 04:25:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:25:45.674 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 24 04:25:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:25:45.675 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 24 04:26:42 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:26:42.453 28854 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 24 04:26:42 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:26:42.459 28950 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 24 04:26:42 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:26:42.461 28854 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '8a:33:26', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '5a:01:fe:5a:30:8a'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 24 04:26:42 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:26:42.462 28854 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 24 04:26:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:26:45.676 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 24 04:26:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:26:45.676 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 24 04:26:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:26:45.676 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 24 04:26:49 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:26:49.464 28854 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=90e1fd23-41d7-4d5a-8723-602357aa34aa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 24 04:26:49 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:26:49.464 28854 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 24 04:27:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:27:45.676 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 24 04:27:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:27:45.677 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 24 04:27:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:27:45.677 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 24 04:28:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:28:45.678 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 24 04:28:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:28:45.678 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 24 04:28:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:28:45.679 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 24 04:29:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:29:45.678 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 24 04:29:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:29:45.679 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 24 04:29:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:29:45.679 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 24 04:30:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:30:45.679 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 24 04:30:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:30:45.680 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 24 04:30:45 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:30:45.680 28854 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 24 04:30:52 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:30:52.928 28950 INFO eventlet.wsgi.server [-] (28950) wsgi exited, is_accepting=True
Feb 24 04:30:52 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:30:52.929 28950 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 24 04:30:52 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:30:52.929 28950 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 24 04:30:52 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:30:52.929 28950 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 24 04:30:52 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:30:52.931 28854 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170
Feb 24 04:30:53 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:30:53.502 28854 INFO oslo_service.service [-] Caught SIGTERM, stopping children
Feb 24 04:30:53 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:30:53.502 28854 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 24 04:30:53 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:30:53.503 28854 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 24 04:30:53 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:30:53.504 28854 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 24 04:30:53 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:30:53.504 28854 DEBUG oslo_service.service [-] Stop services. stop /usr/lib/python3.9/site-packages/oslo_service/service.py:695
Feb 24 04:30:53 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:30:53.505 28854 DEBUG oslo_service.service [-] Killing children. stop /usr/lib/python3.9/site-packages/oslo_service/service.py:700
Feb 24 04:30:53 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:30:53.505 28854 INFO oslo_service.service [-] Waiting on 1 children to exit
Feb 24 04:30:53 edpm-compute-0 ovn_metadata_agent[28849]: 2026-02-24 04:30:53.505 28854 INFO oslo_service.service [-] Child 28950 exited with status 0
Feb 24 04:30:57 edpm-compute-0 podman[174806]: Error: no container with name or ID "ovn_metadata_agent" found: no such container
Feb 24 04:30:57 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Control process exited, code=exited, status=125/n/a
Feb 24 04:30:57 edpm-compute-0 podman[174821]: Error: no container with name or ID "ovn_metadata_agent" found: no such container
Feb 24 04:30:57 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Control process exited, code=exited, status=125/n/a
Feb 24 04:30:57 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Failed with result 'exit-code'.
Feb 24 04:30:57 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Scheduled restart job, restart counter is at 1.
Feb 24 04:30:57 edpm-compute-0 systemd[1]: Stopped ovn_metadata_agent container.
Feb 24 04:30:57 edpm-compute-0 systemd[1]: Starting ovn_metadata_agent container...
Feb 24 04:30:57 edpm-compute-0 edpm-start-podman-container[174846]: ovn_metadata_agent
Feb 24 04:30:58 edpm-compute-0 edpm-start-podman-container[174844]: Creating additional drop-in dependency for "ovn_metadata_agent" (5961392d8348b6c75c1f9875c128941bd36e2b53e8de0949a995f6c50b0cf1fc)
Feb 24 04:30:58 edpm-compute-0 systemd[1]: Started ovn_metadata_agent container.