Feb 26 21:24:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:24:44.357 28844 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 21:24:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:24:44.358 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 21:24:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:24:44.358 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 21:25:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:25:44.359 28844 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 21:25:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:25:44.360 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 21:25:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:25:44.360 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 21:26:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:26:44.360 28844 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 21:26:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:26:44.360 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 21:26:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:26:44.360 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 21:27:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:27:44.362 28844 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 21:27:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:27:44.362 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 21:27:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:27:44.363 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 21:28:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:28:44.363 28844 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 21:28:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:28:44.364 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 21:28:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:28:44.364 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 21:29:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:29:44.365 28844 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 21:29:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:29:44.365 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 21:29:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:29:44.365 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 21:30:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:30:44.366 28844 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 21:30:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:30:44.366 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 21:30:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:30:44.366 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 21:31:29 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:31:29.075 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 26 21:31:29 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:31:29.076 28844 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 26 21:31:30 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:31:30.287 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 21:31:30 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:31:30.288 28941 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 26 21:31:30 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:31:30.288 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 26 21:31:30 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:31:30.289 28844 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 21:31:30 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:31:30.291 28844 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 26 21:31:30 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:31:30.291 28844 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 26 21:31:32 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:31:32.297 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 21:31:32 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:31:32.303 28844 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 21:31:34 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:31:34.300 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 26 21:31:34 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:31:34.300 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 26 21:31:34 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:31:34.304 28844 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 26 21:31:34 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:31:34.305 28844 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 26 21:31:38 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:31:38.307 28844 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 21:31:38 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:31:38.308 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 21:31:42 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:31:42.310 28844 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 26 21:31:42 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:31:42.310 28844 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 26 21:31:42 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:31:42.313 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 26 21:31:42 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:31:42.313 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 26 21:31:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:31:44.367 28844 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 21:31:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:31:44.367 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 21:31:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:31:44.367 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 21:31:50 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:31:50.330 28844 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 26 21:31:50 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:31:50.333 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 26 21:31:50 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:31:50.344 28844 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:f7:8e', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:ee:eb:b3:7d:32'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 21:31:50 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:31:50.345 28844 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 26 21:31:50 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:31:50.346 28844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ae856db2-00c6-43e3-b146-119be49f4f53, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 21:32:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:32:44.369 28844 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 21:32:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:32:44.369 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 21:32:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:32:44.369 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 21:33:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:33:44.370 28844 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 21:33:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:33:44.371 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 21:33:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:33:44.371 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 21:34:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:34:44.372 28844 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 21:34:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:34:44.373 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 21:34:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:34:44.373 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 21:35:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:35:44.373 28844 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 21:35:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:35:44.373 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 21:35:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:35:44.373 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 21:36:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:36:44.374 28844 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 21:36:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:36:44.375 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 21:36:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:36:44.375 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 21:37:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:37:44.376 28844 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 21:37:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:37:44.377 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 21:37:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:37:44.377 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 21:38:40 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:38:40.612 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 26 21:38:40 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:38:40.614 28844 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 26 21:38:41 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:38:41.694 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 21:38:41 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:38:41.695 28844 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 21:38:41 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:38:41.696 28941 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 26 21:38:41 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:38:41.696 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 26 21:38:41 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:38:41.696 28844 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 26 21:38:41 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:38:41.697 28844 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 26 21:38:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:38:44.034 28844 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 21:38:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:38:44.036 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 21:38:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:38:44.378 28844 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 21:38:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:38:44.378 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 21:38:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:38:44.379 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 21:38:46 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:38:46.036 28844 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 26 21:38:46 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:38:46.036 28844 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 26 21:38:46 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:38:46.038 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 26 21:38:46 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:38:46.038 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 26 21:38:50 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:38:50.074 28844 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 21:38:50 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:38:50.074 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 21:38:50 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:38:50.077 28844 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 26 21:38:50 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:38:50.077 28844 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 26 21:38:50 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:38:50.077 28941 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 26 21:38:50 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:38:50.077 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 26 21:39:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:39:44.378 28844 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 21:39:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:39:44.379 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 21:39:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:39:44.379 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 21:40:26 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:40:26.223 28844 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 26 21:40:26 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:40:26.234 28941 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 26 21:40:26 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:40:26.237 28844 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:f7:8e', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:ee:eb:b3:7d:32'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 21:40:26 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:40:26.239 28844 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 26 21:40:31 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:40:31.241 28844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ae856db2-00c6-43e3-b146-119be49f4f53, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 21:40:31 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:40:31.242 28844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 26 21:40:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:40:44.380 28844 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 21:40:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:40:44.380 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 21:40:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:40:44.381 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 21:41:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:41:44.380 28844 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 21:41:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:41:44.381 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 21:41:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:41:44.381 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 21:42:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:42:44.381 28844 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 21:42:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:42:44.382 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 21:42:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:42:44.383 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 21:43:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:43:44.383 28844 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 21:43:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:43:44.383 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 21:43:44 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:43:44.383 28844 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 21:44:03 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:44:03.330 28941 INFO eventlet.wsgi.server [-] (28941) wsgi exited, is_accepting=True
Feb 26 21:44:03 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:44:03.330 28941 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 21:44:03 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:44:03.330 28941 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 21:44:03 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:44:03.331 28941 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 21:44:03 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:44:03.333 28844 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170
Feb 26 21:44:03 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:44:03.457 28844 INFO oslo_service.service [-] Caught SIGTERM, stopping children
Feb 26 21:44:03 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:44:03.458 28844 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 21:44:03 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:44:03.458 28844 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 21:44:03 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:44:03.458 28844 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 21:44:03 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:44:03.458 28844 DEBUG oslo_service.service [-] Stop services. stop /usr/lib/python3.9/site-packages/oslo_service/service.py:695
Feb 26 21:44:03 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:44:03.458 28844 DEBUG oslo_service.service [-] Killing children. stop /usr/lib/python3.9/site-packages/oslo_service/service.py:700
Feb 26 21:44:03 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:44:03.458 28844 INFO oslo_service.service [-] Waiting on 1 children to exit
Feb 26 21:44:03 edpm-compute-0 ovn_metadata_agent[28839]: 2026-02-26 21:44:03.458 28844 INFO oslo_service.service [-] Child 28941 exited with status 0
Feb 26 21:44:05 edpm-compute-0 podman[174471]: Error: no container with ID 538b52d20d9b7fc71943bc1422563dd02520e8eafa5f5d9d2eb3251b4e58d6a7 found in database: no such container
Feb 26 21:44:06 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Control process exited, code=exited, status=125/n/a
Feb 26 21:44:06 edpm-compute-0 podman[174495]: Error: no container with name or ID "ovn_metadata_agent" found: no such container
Feb 26 21:44:06 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Control process exited, code=exited, status=125/n/a
Feb 26 21:44:06 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Failed with result 'exit-code'.
Feb 26 21:44:06 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Scheduled restart job, restart counter is at 1.
Feb 26 21:44:06 edpm-compute-0 systemd[1]: Stopped ovn_metadata_agent container.
Feb 26 21:44:06 edpm-compute-0 systemd[1]: Starting ovn_metadata_agent container...
Feb 26 21:44:06 edpm-compute-0 edpm-start-podman-container[174520]: ovn_metadata_agent
Feb 26 21:44:06 edpm-compute-0 edpm-start-podman-container[174519]: Creating additional drop-in dependency for "ovn_metadata_agent" (2d8351b1a84aab873cf6a81dc8a0eade824c1ae1ad47809453334b34c27844f5)
Feb 26 21:44:07 edpm-compute-0 systemd[1]: Started ovn_metadata_agent container.