Feb 23 09:55:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 09:55:47.104 28843 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:55:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 09:55:47.105 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:55:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 09:55:47.105 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:56:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 09:56:47.105 28843 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:56:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 09:56:47.107 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:56:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 09:56:47.107 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:57:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 09:57:47.107 28843 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:57:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 09:57:47.108 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:57:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 09:57:47.108 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:58:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 09:58:47.109 28843 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:58:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 09:58:47.109 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:58:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 09:58:47.110 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:59:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 09:59:47.111 28843 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:59:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 09:59:47.111 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:59:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 09:59:47.112 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:00:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:00:47.112 28843 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:00:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:00:47.112 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:00:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:00:47.113 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:01:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:01:47.114 28843 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:01:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:01:47.114 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:01:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:01:47.114 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:02:28 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:02:28.498 28939 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 23 10:02:28 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:02:28.500 28843 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 23 10:02:29 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:02:29.550 28939 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 10:02:29 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:02:29.552 28843 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 10:02:30 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:02:30.552 28939 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 23 10:02:30 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:02:30.552 28939 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 23 10:02:30 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:02:30.553 28843 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 23 10:02:30 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:02:30.553 28843 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 23 10:02:32 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:02:32.561 28843 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 10:02:32 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:02:32.569 28939 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 10:02:34 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:02:34.564 28843 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 23 10:02:34 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:02:34.564 28843 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 23 10:02:34 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:02:34.571 28939 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 23 10:02:34 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:02:34.571 28939 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 23 10:02:38 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:02:38.570 28843 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 10:02:38 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:02:38.581 28939 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 10:02:42 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:02:42.572 28843 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 23 10:02:42 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:02:42.572 28843 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 23 10:02:42 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:02:42.585 28939 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 23 10:02:42 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:02:42.585 28939 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 23 10:02:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:02:47.114 28843 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:02:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:02:47.115 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:02:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:02:47.115 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:03:22 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:03:22.821 28843 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 23 10:03:22 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:03:22.962 28939 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 23 10:03:22 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:03:22.969 28843 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ba:ab:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:6e:8a:d9:73:06'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:03:22 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:03:22.970 28843 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 10:03:22 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:03:22.971 28843 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=04fc6220-f201-48b3-a8bf-bb086c4a7315, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:03:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:03:47.116 28843 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:03:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:03:47.117 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:03:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:03:47.117 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:04:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:04:47.117 28843 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:04:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:04:47.117 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:04:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:04:47.117 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:05:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:05:47.118 28843 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:05:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:05:47.118 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:05:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:05:47.118 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:06:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:06:47.119 28843 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:06:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:06:47.120 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:06:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:06:47.120 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:07:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:07:47.120 28843 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:07:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:07:47.121 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:07:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:07:47.121 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:08:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:08:47.121 28843 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:08:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:08:47.122 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:08:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:08:47.122 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:08:54 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:08:54.245 28939 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 23 10:08:54 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:08:54.247 28843 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Feb 23 10:08:55 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:08:55.261 28843 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 10:08:55 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:08:55.261 28939 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 10:08:56 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:08:56.262 28843 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 23 10:08:56 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:08:56.262 28843 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 23 10:08:56 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:08:56.263 28939 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 23 10:08:56 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:08:56.263 28939 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Feb 23 10:08:58 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:08:58.270 28843 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 10:08:58 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:08:58.270 28939 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 10:09:00 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:09:00.270 28843 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 23 10:09:00 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:09:00.271 28843 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 23 10:09:00 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:09:00.272 28939 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Feb 23 10:09:00 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:09:00.273 28939 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Feb 23 10:09:04 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:09:04.276 28843 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 10:09:04 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:09:04.280 28843 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 23 10:09:04 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:09:04.280 28843 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 23 10:09:04 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:09:04.283 28939 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 10:09:04 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:09:04.284 28939 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Feb 23 10:09:04 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:09:04.284 28939 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Feb 23 10:09:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:09:47.122 28843 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:09:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:09:47.122 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:09:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:09:47.123 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:10:16 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:10:16.375 28843 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 23 10:10:16 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:10:16.391 28843 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ba:ab:d8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:6e:8a:d9:73:06'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:10:16 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:10:16.392 28843 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 10:10:16 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:10:16.397 28939 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 23 10:10:20 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:10:20.394 28843 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=04fc6220-f201-48b3-a8bf-bb086c4a7315, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:10:20 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:10:20.395 28843 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 10:10:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:10:47.124 28843 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:10:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:10:47.124 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:10:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:10:47.124 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:11:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:11:47.124 28843 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:11:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:11:47.124 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:11:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:11:47.124 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:12:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:12:47.125 28843 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:12:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:12:47.126 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:12:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:12:47.126 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:13:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:13:47.125 28843 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:13:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:13:47.126 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:13:47 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:13:47.126 28843 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:14:04 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:14:04.176 28939 INFO eventlet.wsgi.server [-] (28939) wsgi exited, is_accepting=True
Feb 23 10:14:04 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:14:04.177 28939 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:14:04 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:14:04.177 28939 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:14:04 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:14:04.177 28939 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:14:04 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:14:04.179 28843 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170
Feb 23 10:14:04 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:14:04.629 28843 INFO oslo_service.service [-] Caught SIGTERM, stopping children
Feb 23 10:14:04 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:14:04.630 28843 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:14:04 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:14:04.630 28843 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:14:04 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:14:04.630 28843 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:14:04 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:14:04.630 28843 DEBUG oslo_service.service [-] Stop services. stop /usr/lib/python3.9/site-packages/oslo_service/service.py:695
Feb 23 10:14:04 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:14:04.630 28843 DEBUG oslo_service.service [-] Killing children. stop /usr/lib/python3.9/site-packages/oslo_service/service.py:700
Feb 23 10:14:04 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:14:04.630 28843 INFO oslo_service.service [-] Waiting on 1 children to exit
Feb 23 10:14:04 edpm-compute-0 ovn_metadata_agent[28838]: 2026-02-23 10:14:04.631 28843 INFO oslo_service.service [-] Child 28939 exited with status 0
Feb 23 10:14:08 edpm-compute-0 podman[174169]: ovn_metadata_agent
Feb 23 10:14:08 edpm-compute-0 podman[174185]: Error: no container with name or ID "ovn_metadata_agent" found: no such container
Feb 23 10:14:08 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Control process exited, code=exited, status=125/n/a
Feb 23 10:14:08 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Failed with result 'exit-code'.
Feb 23 10:14:09 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Scheduled restart job, restart counter is at 1.
Feb 23 10:14:09 edpm-compute-0 systemd[1]: Stopped ovn_metadata_agent container.
Feb 23 10:14:09 edpm-compute-0 systemd[1]: Starting ovn_metadata_agent container...
Feb 23 10:14:09 edpm-compute-0 edpm-start-podman-container[174212]: ovn_metadata_agent
Feb 23 10:14:09 edpm-compute-0 edpm-start-podman-container[174210]: Creating additional drop-in dependency for "ovn_metadata_agent" (79010a47a083cccefa8a2f8a523d54c1568828025733a22da604843ba183ebff)
Feb 23 10:14:09 edpm-compute-0 systemd[1]: Started ovn_metadata_agent container.