Feb 14 13:48:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:48:12.433 28672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 14 13:48:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:48:12.434 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 14 13:48:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:48:12.435 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 14 13:49:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:49:12.434 28672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 14 13:49:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:49:12.435 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 14 13:49:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:49:12.435 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 14 13:50:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:50:12.435 28672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 14 13:50:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:50:12.436 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 14 13:50:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:50:12.436 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 14 13:51:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:51:12.437 28672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 14 13:51:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:51:12.437 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 14 13:51:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:51:12.438 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 14 13:52:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:52:12.438 28672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 14 13:52:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:52:12.439 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 14 13:52:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:52:12.439 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 14 13:53:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:53:12.439 28672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 14 13:53:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:53:12.439 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 14 13:53:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:53:12.440 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 14 13:53:26 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:53:26.492 28766 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 14 13:53:26 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:53:26.494 28672 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 14 13:53:27 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:53:27.516 28672 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 14 13:53:27 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:53:27.518 28672 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 14 13:53:27 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:53:27.519 28672 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 14 13:53:27 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:53:27.526 28766 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 14 13:53:27 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:53:27.527 28766 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 14 13:53:27 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:53:27.528 28766 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 14 13:53:29 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:53:29.524 28672 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 14 13:53:29 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:53:29.525 28672 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 14 13:53:29 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:53:29.525 28672 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 14 13:53:29 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:53:29.536 28766 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 14 13:53:29 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:53:29.538 28766 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 14 13:53:29 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:53:29.538 28766 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 14 13:53:33 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:53:33.570 28672 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 14 13:53:33 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:53:33.582 28766 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 14 13:53:37 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:53:37.571 28672 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 14 13:53:37 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:53:37.571 28672 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 14 13:53:37 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:53:37.585 28766 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 14 13:53:37 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:53:37.585 28766 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 14 13:54:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:54:12.440 28672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 14 13:54:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:54:12.441 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 14 13:54:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:54:12.442 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 14 13:54:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:54:12.727 28672 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 14 13:54:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:54:12.728 28766 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 14 13:54:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:54:12.780 28672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:51:60', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ea:5e:62:35:67'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 14 13:54:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:54:12.785 28672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 14 13:54:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:54:12.787 28672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=474df7d3-3f2d-44d9-8b23-277325171589, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 14 13:55:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:55:12.442 28672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 14 13:55:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:55:12.443 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 14 13:55:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:55:12.443 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 14 13:56:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:56:12.443 28672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 14 13:56:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:56:12.443 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 14 13:56:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:56:12.444 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 14 13:57:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:57:12.445 28672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 14 13:57:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:57:12.446 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 14 13:57:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:57:12.446 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 14 13:58:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:58:12.446 28672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 14 13:58:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:58:12.447 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 14 13:58:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:58:12.447 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 14 13:59:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:12.448 28672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 14 13:59:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:12.448 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 14 13:59:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:12.449 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 14 13:59:17 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:17.688 28672 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 14 13:59:17 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:17.689 28766 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 14 13:59:18 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:18.720 28766 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 14 13:59:18 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:18.720 28672 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 14 13:59:19 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:19.721 28766 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 14 13:59:19 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:19.721 28766 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 14 13:59:19 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:19.722 28672 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 14 13:59:19 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:19.722 28672 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 14 13:59:21 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:21.731 28766 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 14 13:59:21 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:21.735 28672 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 14 13:59:23 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:23.733 28766 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 14 13:59:23 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:23.733 28766 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 14 13:59:23 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:23.736 28672 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 14 13:59:23 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:23.737 28672 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 14 13:59:27 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:27.755 28766 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 14 13:59:27 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:27.755 28672 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 14 13:59:27 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:27.758 28672 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 14 13:59:27 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:27.758 28766 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 14 13:59:27 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:27.758 28766 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 14 13:59:27 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:27.758 28672 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 14 13:59:35 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:35.839 28766 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 14 13:59:35 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:35.841 28672 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 14 13:59:35 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:35.887 28672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:51:60', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1a:ea:5e:62:35:67'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 14 13:59:35 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:35.889 28672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 14 13:59:44 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:44.891 28672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=474df7d3-3f2d-44d9-8b23-277325171589, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 14 13:59:44 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 13:59:44.892 28672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 14 14:00:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 14:00:12.449 28672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 14 14:00:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 14:00:12.450 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 14 14:00:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 14:00:12.450 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 14 14:01:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 14:01:12.451 28672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 14 14:01:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 14:01:12.451 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 14 14:01:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 14:01:12.451 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 14 14:02:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 14:02:12.452 28672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 14 14:02:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 14:02:12.453 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 14 14:02:12 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 14:02:12.455 28672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 14 14:02:47 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 14:02:47.579 28766 INFO eventlet.wsgi.server [-] (28766) wsgi exited, is_accepting=True
Feb 14 14:02:47 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 14:02:47.580 28766 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 14 14:02:47 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 14:02:47.580 28766 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 14 14:02:47 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 14:02:47.580 28766 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 14 14:02:47 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 14:02:47.582 28672 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170
Feb 14 14:02:48 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 14:02:48.066 28672 INFO oslo_service.service [-] Caught SIGTERM, stopping children
Feb 14 14:02:48 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 14:02:48.067 28672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 14 14:02:48 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 14:02:48.067 28672 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 14 14:02:48 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 14:02:48.067 28672 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 14 14:02:48 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 14:02:48.067 28672 DEBUG oslo_service.service [-] Stop services. stop /usr/lib/python3.9/site-packages/oslo_service/service.py:695
Feb 14 14:02:48 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 14:02:48.068 28672 DEBUG oslo_service.service [-] Killing children. stop /usr/lib/python3.9/site-packages/oslo_service/service.py:700
Feb 14 14:02:48 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 14:02:48.068 28672 INFO oslo_service.service [-] Waiting on 1 children to exit
Feb 14 14:02:48 edpm-compute-0 ovn_metadata_agent[28667]: 2026-02-14 14:02:48.068 28672 INFO oslo_service.service [-] Child 28766 exited with status 0
Feb 14 14:02:48 edpm-compute-0 podman[172698]: ovn_metadata_agent
Feb 14 14:02:48 edpm-compute-0 podman[172707]: Error: no container with name or ID "ovn_metadata_agent" found: no such container
Feb 14 14:02:48 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Control process exited, code=exited, status=125/n/a
Feb 14 14:02:48 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Failed with result 'exit-code'.
Feb 14 14:02:48 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Scheduled restart job, restart counter is at 1.
Feb 14 14:02:48 edpm-compute-0 systemd[1]: Stopped ovn_metadata_agent container.
Feb 14 14:02:48 edpm-compute-0 systemd[1]: Starting ovn_metadata_agent container...
Feb 14 14:02:48 edpm-compute-0 edpm-start-podman-container[172734]: ovn_metadata_agent
Feb 14 14:02:48 edpm-compute-0 edpm-start-podman-container[172733]: Creating additional drop-in dependency for "ovn_metadata_agent" (d8805fac0bc740d7bd061eb12f127e8a57bf2f8b22ce4a000ae4a85586373a32)
Feb 14 14:02:49 edpm-compute-0 systemd[1]: Started ovn_metadata_agent container.