Mar 20 18:26:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:26:19.322 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 20 18:26:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:26:19.324 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 20 18:26:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:26:19.324 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 20 18:27:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:27:19.324 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 20 18:27:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:27:19.325 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 20 18:27:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:27:19.325 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 20 18:28:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:28:19.325 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 20 18:28:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:28:19.325 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 20 18:28:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:28:19.325 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 20 18:29:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:29:19.326 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 20 18:29:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:29:19.327 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 20 18:29:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:29:19.327 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 20 18:30:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:30:19.327 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 20 18:30:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:30:19.327 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 20 18:30:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:30:19.327 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 20 18:31:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:31:19.328 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 20 18:31:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:31:19.329 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 20 18:31:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:31:19.329 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 20 18:32:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:32:19.330 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 20 18:32:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:32:19.330 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 20 18:32:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:32:19.330 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 20 18:33:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:33:19.331 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 20 18:33:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:33:19.332 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 20 18:33:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:33:19.332 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 20 18:34:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:34:19.334 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 20 18:34:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:34:19.334 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 20 18:34:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:34:19.334 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 20 18:35:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:35:19.335 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 20 18:35:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:35:19.336 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 20 18:35:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:35:19.336 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 20 18:36:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:36:19.337 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 20 18:36:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:36:19.337 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 20 18:36:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:36:19.337 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 20 18:36:44 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:36:44.115 28954 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Mar 20 18:36:44 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:36:44.117 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Mar 20 18:36:45 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:36:45.164 28954 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 20 18:36:45 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:36:45.165 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 20 18:36:46 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:36:46.165 28954 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Mar 20 18:36:46 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:36:46.166 28954 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Mar 20 18:36:46 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:36:46.167 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Mar 20 18:36:46 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:36:46.167 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Mar 20 18:36:48 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:36:48.179 28954 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 20 18:36:48 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:36:48.181 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 20 18:36:50 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:36:50.182 28954 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Mar 20 18:36:50 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:36:50.182 28954 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Mar 20 18:36:50 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:36:50.183 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Mar 20 18:36:50 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:36:50.184 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Mar 20 18:36:54 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:36:54.263 28954 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 20 18:36:54 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:36:54.265 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 20 18:36:57 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:36:57.347 28855 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (No route to host)
Mar 20 18:36:57 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:36:57.348 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Mar 20 18:36:57 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:36:57.347 28954 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (No route to host)
Mar 20 18:36:57 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:36:57.348 28954 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Mar 20 18:37:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:37:19.338 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 20 18:37:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:37:19.339 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 20 18:37:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:37:19.339 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 20 18:37:40 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:37:40.267 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Mar 20 18:37:40 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:37:40.279 28855 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:1d:03', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:c9:e3:26:37:9c'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Mar 20 18:37:40 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:37:40.280 28954 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Mar 20 18:37:40 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:37:40.281 28855 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Mar 20 18:37:40 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:37:40.282 28855 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c54dcd3-ca2c-4aa5-86de-4794b9e11ad2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Mar 20 18:38:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:38:19.339 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 20 18:38:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:38:19.340 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 20 18:38:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:38:19.340 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 20 18:39:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:39:19.340 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 20 18:39:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:39:19.341 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 20 18:39:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:39:19.341 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 20 18:40:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:40:19.341 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 20 18:40:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:40:19.343 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 20 18:40:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:40:19.343 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 20 18:41:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:41:19.342 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 20 18:41:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:41:19.343 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 20 18:41:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:41:19.343 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 20 18:42:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:42:19.344 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 20 18:42:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:42:19.345 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 20 18:42:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:42:19.345 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 20 18:42:41 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:42:41.884 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Mar 20 18:42:41 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:42:41.887 28954 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection closed by peer
Mar 20 18:42:42 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:42:42.904 28954 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 20 18:42:42 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:42:42.905 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 20 18:42:43 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:42:43.905 28954 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Mar 20 18:42:43 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:42:43.906 28954 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Mar 20 18:42:43 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:42:43.906 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Mar 20 18:42:43 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:42:43.907 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 2 seconds before reconnect
Mar 20 18:42:46 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:42:46.058 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 20 18:42:46 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:42:46.059 28954 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 20 18:42:48 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:42:48.060 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Mar 20 18:42:48 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:42:48.060 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Mar 20 18:42:48 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:42:48.061 28954 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt timed out
Mar 20 18:42:48 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:42:48.061 28954 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: waiting 4 seconds before reconnect
Mar 20 18:42:52 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:42:52.070 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 20 18:42:52 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:42:52.070 28954 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Mar 20 18:42:52 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:42:52.071 28855 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Mar 20 18:42:52 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:42:52.071 28954 WARNING ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Connection refused)
Mar 20 18:42:52 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:42:52.071 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Mar 20 18:42:52 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:42:52.071 28954 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: continuing to reconnect in the background but suppressing further logging
Mar 20 18:43:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:43:19.346 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 20 18:43:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:43:19.346 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 20 18:43:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:43:19.346 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 20 18:44:12 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:44:12.387 28855 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Mar 20 18:44:12 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:44:12.418 28954 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Mar 20 18:44:12 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:44:12.462 28855 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=1, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:1d:03', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:c9:e3:26:37:9c'}, ipsec=False) old=SB_Global() matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Mar 20 18:44:12 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:44:12.463 28855 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Mar 20 18:44:14 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:44:14.466 28855 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7c54dcd3-ca2c-4aa5-86de-4794b9e11ad2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Mar 20 18:44:14 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:44:14.466 28855 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Mar 20 18:44:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:44:19.348 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 20 18:44:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:44:19.349 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 20 18:44:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:44:19.349 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 20 18:45:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:45:19.349 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 20 18:45:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:45:19.350 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 20 18:45:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:45:19.350 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 20 18:46:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:46:19.350 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 20 18:46:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:46:19.351 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 20 18:46:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:46:19.351 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 20 18:47:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:47:19.351 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 20 18:47:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:47:19.352 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 20 18:47:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:47:19.352 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 20 18:48:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:48:19.351 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Mar 20 18:48:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:48:19.352 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Mar 20 18:48:19 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:48:19.352 28855 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Mar 20 18:48:26 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:48:26.612 28954 INFO eventlet.wsgi.server [-] (28954) wsgi exited, is_accepting=True
Mar 20 18:48:26 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:48:26.613 28954 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Mar 20 18:48:26 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:48:26.613 28954 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Mar 20 18:48:26 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:48:26.613 28954 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Mar 20 18:48:26 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:48:26.617 28855 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170
Mar 20 18:48:26 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:48:26.822 28855 INFO oslo_service.service [-] Caught SIGTERM, stopping children
Mar 20 18:48:26 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:48:26.823 28855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Mar 20 18:48:26 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:48:26.823 28855 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Mar 20 18:48:26 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:48:26.823 28855 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Mar 20 18:48:26 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:48:26.823 28855 DEBUG oslo_service.service [-] Stop services. stop /usr/lib/python3.9/site-packages/oslo_service/service.py:695
Mar 20 18:48:26 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:48:26.823 28855 DEBUG oslo_service.service [-] Killing children.
stop /usr/lib/python3.9/site-packages/oslo_service/service.py:700 Mar 20 18:48:26 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:48:26.824 28855 INFO oslo_service.service [-] Waiting on 1 children to exit Mar 20 18:48:26 edpm-compute-0 ovn_metadata_agent[28850]: 2026-03-20 18:48:26.824 28855 INFO oslo_service.service [-] Child 28954 exited with status 0 Mar 20 18:48:26 edpm-compute-0 conmon[28850]: conmon 9e30a6aa71993e3aeed7 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-9e30a6aa71993e3aeed7350dfe5a2ae8e034f32cfb7c9bd40055c1c9b47726dc.scope/container/memory.events Mar 20 18:48:30 edpm-compute-0 podman[174463]: Error: no container with ID 9e30a6aa71993e3aeed7350dfe5a2ae8e034f32cfb7c9bd40055c1c9b47726dc found in database: no such container Mar 20 18:48:30 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Control process exited, code=exited, status=125/n/a Mar 20 18:48:30 edpm-compute-0 podman[174478]: Error: no container with name or ID "ovn_metadata_agent" found: no such container Mar 20 18:48:30 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Control process exited, code=exited, status=125/n/a Mar 20 18:48:30 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Failed with result 'exit-code'. Mar 20 18:48:30 edpm-compute-0 systemd[1]: edpm_ovn_metadata_agent.service: Scheduled restart job, restart counter is at 1. Mar 20 18:48:30 edpm-compute-0 systemd[1]: Stopped ovn_metadata_agent container. Mar 20 18:48:30 edpm-compute-0 systemd[1]: Starting ovn_metadata_agent container... Mar 20 18:48:30 edpm-compute-0 edpm-start-podman-container[174502]: ovn_metadata_agent Mar 20 18:48:30 edpm-compute-0 edpm-start-podman-container[174501]: Creating additional drop-in dependency for "ovn_metadata_agent" (b55e5cb727fd61bf039fa27e4baf6c7b6972507d8a4a4daec5c0fa4d4b46da03) Mar 20 18:48:31 edpm-compute-0 systemd[1]: Started ovn_metadata_agent container.