Feb 23 16:28:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:28:55.079 28827 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 16:28:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:28:55.080 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 16:28:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:28:55.080 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 16:29:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:29:55.080 28827 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 16:29:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:29:55.080 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 16:29:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:29:55.080 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 16:30:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:30:55.082 28827 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 16:30:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:30:55.082 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 16:30:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:30:55.082 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 16:31:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:31:55.082 28827 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 16:31:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:31:55.083 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 16:31:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:31:55.083 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 16:32:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:32:55.084 28827 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 16:32:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:32:55.084 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 16:32:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:32:55.084 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 16:33:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:33:55.086 28827 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 16:33:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:33:55.086 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 16:33:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:33:55.086 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 16:34:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:34:55.088 28827 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 16:34:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:34:55.088 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 16:34:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:34:55.088 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 16:35:28 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:35:28.253 28922 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 23 16:35:28 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:35:28.256 28827 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 23 16:35:29 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:35:29.344 28922 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 16:35:29 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:35:29.344 28827 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 16:35:29 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:35:29.346 28827 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 23 16:35:29 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:35:29.346 28827 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 23 16:35:29 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:35:29.348 28922 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 23 16:35:29 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:35:29.348 28922 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 23 16:35:31 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:35:31.387 28827 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 16:35:31 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:35:31.391 28922 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 16:35:33 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:35:33.389 28827 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 23 16:35:33 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:35:33.389 28827 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 23 16:35:33 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:35:33.393 28922 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 23 16:35:33 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:35:33.393 28922 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 23 16:35:37 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:35:37.411 28922 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 16:35:37 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:35:37.411 28827 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 16:35:41 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:35:41.413 28922 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 23 16:35:41 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:35:41.413 28922 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 23 16:35:41 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:35:41.413 28827 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 23 16:35:41 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:35:41.413 28827 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 23 16:35:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:35:55.089 28827 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 16:35:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:35:55.089 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 16:35:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:35:55.089 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 16:35:57 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:35:57.710 28827 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 23 16:35:57 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:35:57.712 28922 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 23 16:35:57 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:35:57.718 28827 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:5e:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4a:4b:5e:e5:71:b7'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 16:35:57 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:35:57.719 28827 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 16:35:57 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:35:57.720 28827 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=52695af2-02b6-48ea-bc06-641d25b720e1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 16:36:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:36:55.090 28827 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 16:36:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:36:55.091 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 16:36:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:36:55.092 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 16:37:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:37:55.091 28827 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 16:37:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:37:55.092 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 16:37:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:37:55.092 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 16:38:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:38:55.092 28827 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 16:38:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:38:55.092 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 16:38:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:38:55.093 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 16:39:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:39:55.093 28827 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 16:39:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:39:55.093 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 16:39:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:39:55.093 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 16:40:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:40:55.095 28827 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 16:40:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:40:55.095 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 16:40:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:40:55.096 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 16:41:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:41:55.096 28827 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 16:41:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:41:55.096 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 16:41:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:41:55.096 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 16:42:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:42:55.097 28827 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 16:42:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:42:55.097 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 16:42:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:42:55.098 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 16:43:12 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:43:12.289 28922 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 23 16:43:12 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:43:12.290 28827 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 23 16:43:13 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:43:13.344 28827 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 16:43:13 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:43:13.346 28922 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 16:43:14 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:43:14.346 28827 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 23 16:43:14 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:43:14.346 28827 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 23 16:43:14 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:43:14.347 28922 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 23 16:43:14 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:43:14.348 28922 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 23 16:43:16 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:43:16.354 28827 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 16:43:16 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:43:16.357 28922 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 16:43:18 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:43:18.355 28827 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 23 16:43:18 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:43:18.355 28827 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 23 16:43:18 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:43:18.359 28922 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 23 16:43:18 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:43:18.360 28922 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 23 16:43:22 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:43:22.369 28922 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 16:43:22 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:43:22.371 28827 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 16:43:22 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:43:22.373 28922 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 23 16:43:22 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:43:22.374 28922 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 23 16:43:22 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:43:22.374 28827 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 23 16:43:22 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:43:22.374 28827 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 23 16:43:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:43:55.097 28827 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 16:43:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:43:55.098 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 16:43:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:43:55.098 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 16:44:42 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:44:42.476 28827 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 23 16:44:42 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:44:42.496 28922 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 23 16:44:43 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:44:43.199 28827 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:5e:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4a:4b:5e:e5:71:b7'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 16:44:43 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:44:43.201 28827 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 16:44:43 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:44:43.271 28827 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:5e:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4a:4b:5e:e5:71:b7'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 16:44:43 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:44:43.272 28827 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 16:44:43 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:44:43.274 28827 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=52695af2-02b6-48ea-bc06-641d25b720e1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 16:44:43 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:44:43.274 28827 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 16:44:43 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:44:43.311 28827 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '5a:5e:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4a:4b:5e:e5:71:b7'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 16:44:43 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:44:43.312 28827 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 16:44:48 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:44:48.315 28827 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=52695af2-02b6-48ea-bc06-641d25b720e1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 16:44:48 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:44:48.316 28827 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 16:44:50 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:44:50.205 28827 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=52695af2-02b6-48ea-bc06-641d25b720e1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 16:44:50 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:44:50.206 28827 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 16:44:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:44:55.099 28827 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 16:44:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:44:55.099 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 16:44:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:44:55.100 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 16:45:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:45:55.099 28827 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 16:45:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:45:55.100 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 16:45:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:45:55.100 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 16:46:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:46:55.101 28827 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 16:46:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:46:55.102 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 16:46:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:46:55.102 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 16:47:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:47:55.103 28827 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 16:47:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:47:55.104 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 16:47:55 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:47:55.104 28827 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 16:48:39 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:48:39.298 28922 INFO eventlet.wsgi.server [-] (28922) wsgi exited, is_accepting=True
Feb 23 16:48:39 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:48:39.299 28922 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 16:48:39 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:48:39.299 28922 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 16:48:39 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:48:39.299 28827 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170
Feb 23 16:48:39 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:48:39.299 28922 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 16:48:40 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:48:40.279 28827 INFO oslo_service.service [-] Caught SIGTERM, stopping children
Feb 23 16:48:40 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:48:40.280 28827 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 16:48:40 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:48:40.280 28827 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 16:48:40 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:48:40.280 28827 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 16:48:40 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:48:40.280 28827 DEBUG oslo_service.service [-] Stop services. stop /usr/lib/python3.9/site-packages/oslo_service/service.py:695
Feb 23 16:48:40 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:48:40.280 28827 DEBUG oslo_service.service [-] Killing children.
stop /usr/lib/python3.9/site-packages/oslo_service/service.py:700 Feb 23 16:48:40 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:48:40.280 28827 INFO oslo_service.service [-] Waiting on 1 children to exit Feb 23 16:48:40 edpm-compute-0 ovn_metadata_agent[28822]: 2026-02-23 16:48:40.280 28827 INFO oslo_service.service [-] Child 28922 exited with status 0 Feb 23 16:48:40 edpm-compute-0 conmon[28822]: conmon dfcbc97245037bd664b3 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-dfcbc97245037bd664b3e6b2d8a5143ecd0f534c7185c62fe0bc4ee279f27506.scope/container/memory.events Feb 23 16:48:43 edpm-compute-0 podman[174557]: Error: no container with ID dfcbc97245037bd664b3e6b2d8a5143ecd0f534c7185c62fe0bc4ee279f27506 found in database: no such container Feb 23 16:48:43 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Control process exited, code=exited, status=125/n/a Feb 23 16:48:44 edpm-compute-0 podman[174572]: Error: no container with name or ID "ovn_metadata_agent" found: no such container Feb 23 16:48:44 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Control process exited, code=exited, status=125/n/a Feb 23 16:48:44 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Failed with result 'exit-code'. Feb 23 16:48:44 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Scheduled restart job, restart counter is at 1. Feb 23 16:48:44 edpm-compute-0 systemd[1]: Stopped ovn_metadata_agent container. Feb 23 16:48:44 edpm-compute-0 systemd[1]: Starting ovn_metadata_agent container... Feb 23 16:48:44 edpm-compute-0 edpm-start-podman-container[174604]: ovn_metadata_agent Feb 23 16:48:44 edpm-compute-0 edpm-start-podman-container[174601]: Creating additional drop-in dependency for "ovn_metadata_agent" (49d520bac095535683135f0b6b8426a44d8b6bf0a3d2ba2e485e2aab1555060f) Feb 23 16:48:45 edpm-compute-0 systemd[1]: Started ovn_metadata_agent container.