Feb 26 11:28:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:28:52.231 28834 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 11:28:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:28:52.232 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 11:28:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:28:52.232 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 11:29:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:29:52.232 28834 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 11:29:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:29:52.233 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 11:29:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:29:52.233 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 11:30:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:30:52.234 28834 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 11:30:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:30:52.234 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 11:30:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:30:52.235 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 11:31:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:31:52.234 28834 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 11:31:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:31:52.237 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 11:31:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:31:52.237 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 11:32:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:32:52.236 28834 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 11:32:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:32:52.237 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 11:32:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:32:52.237 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 11:33:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:33:52.238 28834 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 11:33:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:33:52.239 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 11:33:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:33:52.239 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 11:34:46 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:34:46.454 28834 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 26 11:34:46 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:34:46.453 28951 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 26 11:34:47 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:34:47.518 28834 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 11:34:47 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:34:47.518 28951 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 11:34:47 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:34:47.520 28951 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 26 11:34:47 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:34:47.521 28834 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 26 11:34:47 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:34:47.521 28951 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 26 11:34:47 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:34:47.522 28834 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 26 11:34:49 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:34:49.543 28834 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 11:34:49 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:34:49.544 28951 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 11:34:49 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:34:49.545 28951 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 26 11:34:49 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:34:49.545 28951 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 26 11:34:49 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:34:49.545 28834 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 26 11:34:49 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:34:49.545 28834 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 26 11:34:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:34:52.239 28834 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 11:34:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:34:52.239 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 11:34:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:34:52.240 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 11:34:53 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:34:53.581 28951 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 11:34:53 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:34:53.582 28834 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 11:34:57 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:34:57.583 28834 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 26 11:34:57 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:34:57.583 28834 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 26 11:34:57 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:34:57.586 28951 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 26 11:34:57 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:34:57.586 28951 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 26 11:35:29 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:35:29.674 28834 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 26 11:35:29 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:35:29.690 28834 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '46:12:59', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '5a:ee:35:ff:a4:4b'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 11:35:29 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:35:29.691 28951 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 26 11:35:29 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:35:29.694 28834 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 26 11:35:29 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:35:29.695 28834 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7fd15aa9-e7d5-4051-bc30-187ff3431cb7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 11:35:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:35:52.240 28834 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 11:35:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:35:52.240 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 11:35:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:35:52.240 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 11:36:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:36:52.242 28834 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 11:36:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:36:52.243 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 11:36:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:36:52.243 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 11:37:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:37:52.243 28834 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 11:37:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:37:52.243 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 11:37:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:37:52.244 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 11:38:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:38:52.245 28834 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 11:38:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:38:52.245 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 11:38:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:38:52.245 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 11:39:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:39:52.246 28834 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 11:39:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:39:52.247 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 11:39:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:39:52.247 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 11:40:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:40:52.247 28834 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 11:40:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:40:52.247 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 11:40:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:40:52.247 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 11:41:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:41:52.248 28834 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 11:41:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:41:52.249 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 11:41:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:41:52.249 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 11:42:12 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:42:12.346 28951 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 26 11:42:12 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:42:12.348 28834 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 26 11:42:13 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:42:13.395 28951 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 11:42:13 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:42:13.398 28834 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 11:42:14 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:42:14.397 28951 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 26 11:42:14 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:42:14.398 28951 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 26 11:42:14 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:42:14.400 28834 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 26 11:42:14 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:42:14.400 28834 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 26 11:42:16 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:42:16.409 28951 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 11:42:16 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:42:16.414 28834 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 11:42:18 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:42:18.412 28951 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 26 11:42:18 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:42:18.413 28951 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 26 11:42:18 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:42:18.417 28834 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 26 11:42:18 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:42:18.417 28834 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 26 11:42:22 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:42:22.428 28834 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 11:42:22 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:42:22.438 28951 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 11:42:22 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:42:22.439 28951 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 26 11:42:22 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:42:22.439 28951 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 26 11:42:22 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:42:22.439 28834 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 26 11:42:22 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:42:22.440 28834 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 26 11:42:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:42:52.248 28834 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 11:42:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:42:52.249 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 11:42:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:42:52.249 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 11:43:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:43:52.251 28834 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 11:43:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:43:52.251 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 11:43:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:43:52.251 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 11:43:58 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:43:58.545 28834 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 26 11:43:58 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:43:58.574 28834 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '46:12:59', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '5a:ee:35:ff:a4:4b'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 11:43:58 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:43:58.578 28834 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 26 11:43:58 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:43:58.583 28951 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 26 11:44:00 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:44:00.580 28834 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7fd15aa9-e7d5-4051-bc30-187ff3431cb7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 11:44:00 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:44:00.581 28834 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 26 11:44:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:44:52.252 28834 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 11:44:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:44:52.253 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 11:44:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:44:52.253 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 11:45:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:45:52.254 28834 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 11:45:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:45:52.254 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 11:45:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:45:52.255 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 11:46:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:46:52.256 28834 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 11:46:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:46:52.256 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 11:46:52 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:46:52.256 28834 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 11:47:37 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:47:37.610 28951 INFO eventlet.wsgi.server [-] (28951) wsgi exited, is_accepting=True
Feb 26 11:47:37 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:47:37.612 28834 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170
Feb 26 11:47:37 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:47:37.612 28951 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 11:47:37 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:47:37.613 28951 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 11:47:37 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:47:37.613 28951 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 11:47:38 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:47:38.547 28834 INFO oslo_service.service [-] Caught SIGTERM, stopping children
Feb 26 11:47:38 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:47:38.548 28834 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 11:47:38 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:47:38.548 28834 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 11:47:38 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:47:38.548 28834 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 11:47:38 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:47:38.548 28834 DEBUG oslo_service.service [-] Stop services. stop /usr/lib/python3.9/site-packages/oslo_service/service.py:695
Feb 26 11:47:38 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:47:38.549 28834 DEBUG oslo_service.service [-] Killing children. stop /usr/lib/python3.9/site-packages/oslo_service/service.py:700
Feb 26 11:47:38 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:47:38.549 28834 INFO oslo_service.service [-] Waiting on 1 children to exit
Feb 26 11:47:38 edpm-compute-0 ovn_metadata_agent[28829]: 2026-02-26 11:47:38.549 28834 INFO oslo_service.service [-] Child 28951 exited with status 0
Feb 26 11:47:38 edpm-compute-0 conmon[28829]: conmon 89344ef072b9f74a2cd1 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-89344ef072b9f74a2cd1475b35803c69b12a72eba66c217332e17109bb753c80.scope/container/memory.events
Feb 26 11:47:43 edpm-compute-0 podman[174517]: Error: no container with ID 89344ef072b9f74a2cd1475b35803c69b12a72eba66c217332e17109bb753c80 found in database: no such container
Feb 26 11:47:43 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Control process exited, code=exited, status=125/n/a
Feb 26 11:47:43 edpm-compute-0 podman[174534]: Error: no container with name or ID "ovn_metadata_agent" found: no such container
Feb 26 11:47:43 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Control process exited, code=exited, status=125/n/a
Feb 26 11:47:43 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Failed with result 'exit-code'.
Feb 26 11:47:43 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Scheduled restart job, restart counter is at 1.
Feb 26 11:47:43 edpm-compute-0 systemd[1]: Stopped ovn_metadata_agent container.
Feb 26 11:47:43 edpm-compute-0 systemd[1]: Starting ovn_metadata_agent container...
Feb 26 11:47:43 edpm-compute-0 edpm-start-podman-container[174555]: ovn_metadata_agent
Feb 26 11:47:44 edpm-compute-0 edpm-start-podman-container[174554]: Creating additional drop-in dependency for "ovn_metadata_agent" (caa32ff05af35cad1a6d2a59fb2762f2bfc3f005e6492a0607305e36ef267cca)
Feb 26 11:47:44 edpm-compute-0 systemd[1]: Started ovn_metadata_agent container.