Mar 17 18:20:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:20:23.247 28839 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 17 18:20:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:20:23.247 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 17 18:20:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:20:23.248 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 17 18:21:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:21:23.248 28839 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 17 18:21:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:21:23.249 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 17 18:21:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:21:23.249 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 17 18:22:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:22:23.250 28839 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 17 18:22:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:22:23.251 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 17 18:22:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:22:23.251 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 17 18:23:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:23:23.252 28839 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 17 18:23:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:23:23.253 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 17 18:23:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:23:23.253 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 17 18:24:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:24:23.254 28839 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 17 18:24:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:24:23.254 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 17 18:24:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:24:23.254 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 17 18:25:14 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:25:14.972 28839 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Mar 17 18:25:14 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:25:14.974 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Mar 17 18:25:16 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:25:16.075 28839 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 17 18:25:16 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:25:16.076 28839 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Mar 17 18:25:16 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:25:16.076 28839 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Mar 17 18:25:16 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:25:16.078 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 17 18:25:16 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:25:16.079 28959 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Mar 17 18:25:16 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:25:16.079 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Mar 17 18:25:18 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:25:18.339 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 17 18:25:18 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:25:18.341 28839 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 17 18:25:20 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:25:20.343 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Mar 17 18:25:20 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:25:20.343 28839 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Mar 17 18:25:20 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:25:20.343 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Mar 17 18:25:20 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:25:20.344 28839 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Mar 17 18:25:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:25:23.255 28839 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 17 18:25:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:25:23.255 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 17 18:25:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:25:23.255 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 17 18:25:24 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:25:24.352 28839 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 17 18:25:24 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:25:24.352 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 17 18:25:28 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:25:28.211 28959 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (No route to host)
Mar 17 18:25:28 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:25:28.211 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Mar 17 18:25:28 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:25:28.211 28839 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (No route to host)
Mar 17 18:25:28 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:25:28.212 28839 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Mar 17 18:26:08 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:26:08.569 28839 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Mar 17 18:26:08 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:26:08.580 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Mar 17 18:26:08 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:26:08.601 28839 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9e:e5:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:3c:b0:04:10:9f'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Mar 17 18:26:08 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:26:08.603 28839 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Mar 17 18:26:08 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:26:08.604 28839 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b2e4282d-154e-44cd-af5a-627a95eebd27, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Mar 17 18:26:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:26:23.255 28839 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 17 18:26:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:26:23.256 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 17 18:26:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:26:23.256 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 17 18:27:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:27:23.256 28839 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 17 18:27:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:27:23.257 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 17 18:27:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:27:23.257 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 17 18:28:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:28:23.257 28839 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 17 18:28:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:28:23.258 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 17 18:28:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:28:23.259 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 17 18:29:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:29:23.258 28839 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 17 18:29:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:29:23.260 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 17 18:29:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:29:23.260 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 17 18:30:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:30:23.260 28839 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 17 18:30:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:30:23.260 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 17 18:30:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:30:23.261 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 17 18:31:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:31:23.262 28839 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 17 18:31:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:31:23.262 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 17 18:31:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:31:23.262 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 17 18:32:04 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:32:04.621 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Mar 17 18:32:04 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:32:04.622 28839 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Mar 17 18:32:05 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:32:05.702 28839 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 17 18:32:05 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:32:05.742 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 17 18:32:06 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:32:06.704 28839 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Mar 17 18:32:06 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:32:06.704 28839 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Mar 17 18:32:06 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:32:06.744 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Mar 17 18:32:06 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:32:06.744 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Mar 17 18:32:08 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:32:08.714 28839 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 17 18:32:08 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:32:08.783 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 17 18:32:10 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:32:10.715 28839 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Mar 17 18:32:10 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:32:10.716 28839 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Mar 17 18:32:10 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:32:10.786 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Mar 17 18:32:10 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:32:10.787 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Mar 17 18:32:14 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:32:14.722 28839 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 17 18:32:14 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:32:14.725 28839 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Mar 17 18:32:14 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:32:14.725 28839 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Mar 17 18:32:14 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:32:14.800 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 17 18:32:14 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:32:14.802 28959 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Mar 17 18:32:14 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:32:14.802 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Mar 17 18:32:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:32:23.263 28839 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 17 18:32:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:32:23.263 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 17 18:32:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:32:23.264 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 17 18:33:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:33:23.264 28839 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 17 18:33:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:33:23.265 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 17 18:33:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:33:23.265 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 17 18:33:58 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:33:58.870 28839 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Mar 17 18:33:58 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:33:58.903 28839 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9e:e5:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '82:3c:b0:04:10:9f'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Mar 17 18:33:58 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:33:58.905 28839 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Mar 17 18:33:58 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:33:58.972 28959 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Mar 17 18:34:06 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:34:06.907 28839 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b2e4282d-154e-44cd-af5a-627a95eebd27, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Mar 17 18:34:06 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:34:06.908 28839 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Mar 17 18:34:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:34:23.266 28839 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 17 18:34:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:34:23.267 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 17 18:34:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:34:23.267 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 17 18:35:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:35:23.268 28839 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 17 18:35:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:35:23.268 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 17 18:35:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:35:23.269 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 17 18:36:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:36:23.270 28839 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 17 18:36:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:36:23.270 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 17 18:36:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:36:23.270 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 17 18:37:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:37:23.270 28839 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 17 18:37:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:37:23.271 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 17 18:37:23 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:37:23.271 28839 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 17 18:37:31 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:37:31.991 28959 INFO eventlet.wsgi.server [-] (28959) wsgi exited, is_accepting=True
Mar 17 18:37:31 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:37:31.992 28959 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Mar 17 18:37:31 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:37:31.992 28959 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Mar 17 18:37:31 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:37:31.992 28959 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Mar 17 18:37:32 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:37:32.000 28839 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170
Mar 17 18:37:32 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:37:32.865 28839 INFO oslo_service.service [-] Caught SIGTERM, stopping children
Mar 17 18:37:32 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:37:32.865 28839 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Mar 17 18:37:32 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:37:32.865 28839 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Mar 17 18:37:32 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:37:32.865 28839 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Mar 17 18:37:32 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:37:32.865 28839 DEBUG oslo_service.service [-] Stop services. stop /usr/lib/python3.9/site-packages/oslo_service/service.py:695
Mar 17 18:37:32 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:37:32.865 28839 DEBUG oslo_service.service [-] Killing children. stop /usr/lib/python3.9/site-packages/oslo_service/service.py:700
Mar 17 18:37:32 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:37:32.866 28839 INFO oslo_service.service [-] Waiting on 1 children to exit
Mar 17 18:37:32 edpm-compute-0 ovn_metadata_agent[28834]: 2026-03-17 18:37:32.866 28839 INFO oslo_service.service [-] Child 28959 exited with status 0
Mar 17 18:37:36 edpm-compute-0 podman[174455]: Error: no container with ID e1317451e5e91910e74afc432e237ae6fa886d49012fbccc905d4ef9a9ee733d found in database: no such container
Mar 17 18:37:36 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Control process exited, code=exited, status=125/n/a
Mar 17 18:37:36 edpm-compute-0 podman[174469]: Error: no container with name or ID "ovn_metadata_agent" found: no such container
Mar 17 18:37:36 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Control process exited, code=exited, status=125/n/a
Mar 17 18:37:36 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Failed with result 'exit-code'.
Mar 17 18:37:36 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Scheduled restart job, restart counter is at 1.
Mar 17 18:37:36 edpm-compute-0 systemd[1]: Stopped ovn_metadata_agent container.
Mar 17 18:37:36 edpm-compute-0 systemd[1]: Starting ovn_metadata_agent container...
Mar 17 18:37:36 edpm-compute-0 edpm-start-podman-container[174501]: ovn_metadata_agent
Mar 17 18:37:37 edpm-compute-0 edpm-start-podman-container[174498]: Creating additional drop-in dependency for "ovn_metadata_agent" (25a66b4a6a087f52cdf70784d0db29b3e407293e3449033c093957b650669346)
Mar 17 18:37:37 edpm-compute-0 systemd[1]: Started ovn_metadata_agent container.