\nThe Keystone service is temporarily unavailable.\n\n", "code": "503 Service Unavailable", "title": "Service Unavailable"} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:258
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] [instance: 1d40a622-5897-42e0-a00b-d387818a170d] An error occurred while refreshing the network cache.: neutronclient.common.exceptions.ServiceUnavailable: The server is currently unavailable. Please try again at a later time.
The Keystone service is temporarily unavailable.
Neutron server returns request_ids: ['req-3e1a459a-57f9-445b-9578-fa784076e786']
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] Traceback (most recent call last):
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9593, in _heal_instance_info_cache
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] self._require_nw_info_update(context, instance):
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9500, in _require_nw_info_update
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] ports = self.network_api.list_ports(context, **search_opts)
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1806, in list_ports
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] return get_client(context).list_ports(**search_opts)
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] ret = obj(*args, **kwargs)
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 799, in list_ports
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] return self.list('ports', self.ports_path, retrieve_all,
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] ret = obj(*args, **kwargs)
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 368, in list
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] for r in self._pagination(collection, path, **params):
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 383, in _pagination
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] res = self.get(path, params=params)
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] ret = obj(*args, **kwargs)
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 352, in get
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] return self.retry_request("GET", action, body=body,
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] ret = obj(*args, **kwargs)
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 329, in retry_request
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] return self.do_request(method, action, body=body,
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] ret = obj(*args, **kwargs)
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 293, in do_request
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] self._handle_fault_response(status_code, replybody, resp)
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] ret = obj(*args, **kwargs)
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 268, in _handle_fault_response
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] exception_handler_v20(status_code, error_body)
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] raise client_exc(message=error_message,
2025-12-16 08:42:57.820 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] neutronclient.common.exceptions.ServiceUnavailable: The server is currently unavailable. Please try again at a later time.
\nThe Keystone service is temporarily unavailable.\n\n", "code": "503 Service Unavailable", "title": "Service Unavailable"} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:258
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] [instance: 1d40a622-5897-42e0-a00b-d387818a170d] An error occurred while refreshing the network cache.: neutronclient.common.exceptions.ServiceUnavailable: The server is currently unavailable. Please try again at a later time.
The Keystone service is temporarily unavailable.
Neutron server returns request_ids: ['req-1d92887e-76a4-4adf-aa89-2532686f0bd3']
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] Traceback (most recent call last):
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9593, in _heal_instance_info_cache
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] self._require_nw_info_update(context, instance):
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9500, in _require_nw_info_update
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] ports = self.network_api.list_ports(context, **search_opts)
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1806, in list_ports
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] return get_client(context).list_ports(**search_opts)
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] ret = obj(*args, **kwargs)
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 799, in list_ports
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] return self.list('ports', self.ports_path, retrieve_all,
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] ret = obj(*args, **kwargs)
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 368, in list
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] for r in self._pagination(collection, path, **params):
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 383, in _pagination
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] res = self.get(path, params=params)
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] ret = obj(*args, **kwargs)
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 352, in get
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] return self.retry_request("GET", action, body=body,
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] ret = obj(*args, **kwargs)
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 329, in retry_request
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] return self.do_request(method, action, body=body,
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] ret = obj(*args, **kwargs)
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 293, in do_request
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] self._handle_fault_response(status_code, replybody, resp)
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] ret = obj(*args, **kwargs)
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 268, in _handle_fault_response
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] exception_handler_v20(status_code, error_body)
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] raise client_exc(message=error_message,
2025-12-16 08:43:59.793 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] neutronclient.common.exceptions.ServiceUnavailable: The server is currently unavailable. Please try again at a later time.
\nThe Keystone service is temporarily unavailable.\n\n", "code": "503 Service Unavailable", "title": "Service Unavailable"} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:258
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] [instance: 1d40a622-5897-42e0-a00b-d387818a170d] An error occurred while refreshing the network cache.: neutronclient.common.exceptions.ServiceUnavailable: The server is currently unavailable. Please try again at a later time.
The Keystone service is temporarily unavailable.
Neutron server returns request_ids: ['req-76d54c2e-29eb-4fa4-81c8-fcda8213058b']
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] Traceback (most recent call last):
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9593, in _heal_instance_info_cache
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] self._require_nw_info_update(context, instance):
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9500, in _require_nw_info_update
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] ports = self.network_api.list_ports(context, **search_opts)
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1806, in list_ports
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] return get_client(context).list_ports(**search_opts)
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] ret = obj(*args, **kwargs)
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 799, in list_ports
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] return self.list('ports', self.ports_path, retrieve_all,
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] ret = obj(*args, **kwargs)
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 368, in list
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] for r in self._pagination(collection, path, **params):
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 383, in _pagination
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] res = self.get(path, params=params)
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] ret = obj(*args, **kwargs)
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 352, in get
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] return self.retry_request("GET", action, body=body,
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] ret = obj(*args, **kwargs)
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 329, in retry_request
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] return self.do_request(method, action, body=body,
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] ret = obj(*args, **kwargs)
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 293, in do_request
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] self._handle_fault_response(status_code, replybody, resp)
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] ret = obj(*args, **kwargs)
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 268, in _handle_fault_response
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] exception_handler_v20(status_code, error_body)
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] raise client_exc(message=error_message,
2025-12-16 08:45:00.802 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] neutronclient.common.exceptions.ServiceUnavailable: The server is currently unavailable. Please try again at a later time.
\nThe Keystone service is temporarily unavailable.\n\n", "code": "503 Service Unavailable", "title": "Service Unavailable"}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 91f0e29f-53fd-42fb-8acc-48edc21f22ed: {"message": "The server is currently unavailable. Please try again at a later time.
\nThe Keystone service is temporarily unavailable.\n\n", "code": "503 Service Unavailable", "title": "Service Unavailable"}
2025-12-16 08:45:03.290 2 DEBUG nova.compute.resource_tracker [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056
2025-12-16 08:45:03.290 2 DEBUG nova.compute.resource_tracker [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Final resource view: name=np0005561306.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 08:45:03 up 1:57, 0 users, load average: 0.17, 0.26, 0.31\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_a5dc8740602245aba61a384fd0802b18': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065
2025-12-16 08:45:03.351 2 ERROR nova.scheduler.client.report [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] [req-7add8f01-2e63-4781-b5dc-e1b9623cb9a1] Failed to retrieve resource provider tree from placement API for UUID 91f0e29f-53fd-42fb-8acc-48edc21f22ed. Got 503: {"message": "The server is currently unavailable. Please try again at a later time.
The proxy server received an invalid
2025-12-16 08:46:17.470 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] response from an upstream server.
2025-12-16 08:46:17.470 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] The proxy server could not handle the request
\nThe Keystone service is temporarily unavailable.\n\n", "code": "503 Service Unavailable", "title": "Service Unavailable"}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 91f0e29f-53fd-42fb-8acc-48edc21f22ed: {"message": "The server is currently unavailable. Please try again at a later time.
\nThe Keystone service is temporarily unavailable.\n\n", "code": "503 Service Unavailable", "title": "Service Unavailable"}
2025-12-16 08:46:18.304 2 DEBUG nova.compute.resource_tracker [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056
2025-12-16 08:46:18.305 2 DEBUG nova.compute.resource_tracker [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Final resource view: name=np0005561306.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 08:46:18 up 1:59, 0 users, load average: 0.22, 0.25, 0.29\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_a5dc8740602245aba61a384fd0802b18': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065
2025-12-16 08:46:18.403 2 ERROR nova.scheduler.client.report [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] [req-db4a4c76-d764-457e-8dc3-914ca9e42597] Failed to retrieve resource provider tree from placement API for UUID 91f0e29f-53fd-42fb-8acc-48edc21f22ed. Got 503: {"message": "The server is currently unavailable. Please try again at a later time.
The server is temporarily unable to service your
2025-12-16 08:47:02.710 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] request due to maintenance downtime or capacity
2025-12-16 08:47:02.710 2 ERROR nova.compute.manager [instance: 1d40a622-5897-42e0-a00b-d387818a170d] problems. Please try again later.
\nThe Keystone service is temporarily unavailable.\n\n", "code": "503 Service Unavailable", "title": "Service Unavailable"}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 91f0e29f-53fd-42fb-8acc-48edc21f22ed: {"message": "The server is currently unavailable. Please try again at a later time.
\nThe Keystone service is temporarily unavailable.\n\n", "code": "503 Service Unavailable", "title": "Service Unavailable"}
2025-12-16 08:47:09.435 2 DEBUG nova.compute.resource_tracker [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056
2025-12-16 08:47:09.435 2 DEBUG nova.compute.resource_tracker [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Final resource view: name=np0005561306.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 08:47:06 up 2:00, 0 users, load average: 0.14, 0.23, 0.28\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_a5dc8740602245aba61a384fd0802b18': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065
2025-12-16 08:47:09.495 2 ERROR nova.scheduler.client.report [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] [req-f6aca491-1f58-44ec-976d-daaa4fc31161] Failed to retrieve resource provider tree from placement API for UUID 91f0e29f-53fd-42fb-8acc-48edc21f22ed. Got 503: {"message": "The server is currently unavailable. Please try again at a later time.
\nThe Keystone service is temporarily unavailable.\n\n", "code": "503 Service Unavailable", "title": "Service Unavailable"}.
2025-12-16 08:47:09.496 2 DEBUG oslo_concurrency.lockutils [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Error updating resources for node np0005561306.ooo.test.: nova.exception.ResourceProviderRetrievalFailed: Failed to get resource provider with UUID 91f0e29f-53fd-42fb-8acc-48edc21f22ed
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager Traceback (most recent call last):
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10183, in _update_available_resource_for_node
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager self.rt.update_available_resource(context, nodename,
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 886, in update_available_resource
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager self._update_available_resource(context, resources, startup=startup)
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 360, in inner
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager return f(*args, **kwargs)
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 993, in _update_available_resource
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager self._update(context, cn, startup=startup)
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 1247, in _update
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager self._update_to_placement(context, compute_node, startup)
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/retrying.py", line 49, in wrapped_f
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager return Retrying(*dargs, **dkw).call(f, *args, **kw)
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/retrying.py", line 206, in call
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager return attempt.get(self._wrap_exception)
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/retrying.py", line 247, in get
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager six.reraise(self.value[0], self.value[1], self.value[2])
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/six.py", line 709, in reraise
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager raise value
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/retrying.py", line 200, in call
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager attempt = Attempt(fn(*args, **kwargs), attempt_number, False)
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 1177, in _update_to_placement
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager prov_tree = self.reportclient.get_provider_tree_and_ensure_root(
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/scheduler/client/report.py", line 899, in get_provider_tree_and_ensure_root
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager self._ensure_resource_provider(
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/scheduler/client/report.py", line 688, in _ensure_resource_provider
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager rps_to_refresh = self.get_providers_in_tree(context, uuid)
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/scheduler/client/report.py", line 551, in get_providers_in_tree
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager raise exception.ResourceProviderRetrievalFailed(uuid=uuid)
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager nova.exception.ResourceProviderRetrievalFailed: Failed to get resource provider with UUID 91f0e29f-53fd-42fb-8acc-48edc21f22ed
2025-12-16 08:47:09.496 2 ERROR nova.compute.manager
2025-12-16 08:47:09.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:47:09.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:47:09.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:47:09.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:47:10.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:47:10.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:47:15.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:47:15.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:47:15.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:47:15.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:47:15.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:47:15.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:47:20.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:47:20.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:47:20.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:47:20.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:47:20.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:47:20.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:47:25.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:47:25.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:47:25.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:47:25.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:47:25.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:47:25.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:47:30.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:47:30.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:47:30.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:47:30.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:47:30.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:47:30.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:47:35.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:47:35.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:47:35.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5044 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:47:35.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:47:35.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:47:35.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:47:40.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:47:40.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:47:40.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:47:40.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:47:40.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:47:40.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:47:45.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:47:45.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:47:45.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:47:45.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:47:45.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:47:45.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:47:50.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:47:50.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:47:50.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5027 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:47:50.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:47:50.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:47:50.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:47:55.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:47:55.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:47:55.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:47:55.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:47:55.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:47:55.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:48:00.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:48:00.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:48:00.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:48:00.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:48:00.423 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 08:48:00.424 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 08:48:00.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:48:00.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:48:05.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:48:05.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:48:05.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:48:05.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:48:05.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:48:05.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:48:10.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:48:10.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:48:10.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:48:10.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:48:10.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:48:10.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:48:15.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:48:15.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:48:15.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:48:15.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:48:15.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:48:15.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:48:20.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:48:20.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:48:20.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:48:20.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:48:20.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:48:20.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:48:25.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:48:25.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:48:25.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:48:25.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:48:25.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:48:25.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:48:30.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:48:30.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:48:30.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:48:30.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:48:30.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:48:30.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:48:35.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:48:35.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:48:35.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:48:35.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:48:35.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:48:35.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:48:40.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:48:40.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:48:40.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:48:40.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:48:40.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:48:40.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:48:41.828 2 WARNING nova.servicegroup.drivers.db [-] Lost connection to nova-conductor for reporting service status.: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 9e58448e0ba8453c93536002c48adf66
2025-12-16 08:48:41.829 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-12-16 08:48:45.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:48:50.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:48:50.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:48:50.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:48:50.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:48:50.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:48:50.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:48:55.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:48:55.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:48:55.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:48:55.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:48:55.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:48:55.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Error during ComputeManager._sync_scheduler_instance_info: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 7c1f86ef933a4a7b87b96b89ffdbb686
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task _queue.Empty
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task task(self, context)
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2153, in _sync_scheduler_instance_info
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task instances = objects.InstanceList.get_by_host(context, self.host,
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 7c1f86ef933a4a7b87b96b89ffdbb686
2025-12-16 08:49:00.431 2 ERROR oslo_service.periodic_task
2025-12-16 08:49:00.433 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 08:49:00.434 2 DEBUG nova.compute.manager [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9530
2025-12-16 08:49:00.434 2 DEBUG nova.compute.manager [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9534
2025-12-16 08:49:00.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:49:00.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:49:00.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:49:00.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:49:00.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:49:00.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:49:05.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:49:05.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:49:05.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:49:05.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:49:05.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:49:05.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:49:10.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:49:10.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:49:10.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:49:10.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:49:10.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:49:10.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:49:15.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:49:20.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:49:25.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:49:30.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:49:35.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:49:35.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:49:35.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5019 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:49:35.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:49:35.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:49:35.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:49:40.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:49:41.836 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
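[editor's note] The warning above comes from the fixed-interval looping call that drives the service-group heartbeat (nova.servicegroup.drivers.db.DbDriver._report_state). It fires when one invocation of the callback takes longer than its configured interval; an overrun of about 50 seconds on a 10-second interval matches a heartbeat stalled for roughly a minute, consistent with the RPC timeouts elsewhere in this log. Below is a simplified sketch of how such an overrun check can work; it is an illustration under those assumptions, not the oslo.service implementation.

    # Simplified illustration (assumption: not the oslo.service source) of a
    # fixed-interval loop that warns when its callback overruns the interval,
    # producing messages like "run outlasted interval by 50.01 sec".
    import time

    def run_fixed_interval(func, interval, rounds=3):
        for _ in range(rounds):
            start = time.monotonic()
            func()
            elapsed = time.monotonic() - start
            delay = interval - elapsed
            if delay <= 0:
                print('Function %r run outlasted interval by %.2f sec'
                      % (func.__name__, -delay))
                delay = 0
            time.sleep(delay)

    def _report_state():
        time.sleep(0.3)   # simulate a heartbeat blocked on a slow RPC call

    run_fixed_interval(_report_state, interval=0.1)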
2025-12-16 08:49:45.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:49:45.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:49:45.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:49:45.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:49:45.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:49:45.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:49:50.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:49:55.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Error during ComputeManager._heal_instance_info_cache: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID a4ea05f51c3e40ed8ab9f925a843973f
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task _queue.Empty
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task task(self, context)
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9535, in _heal_instance_info_cache
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task db_instances = objects.InstanceList.get_by_host(
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID a4ea05f51c3e40ed8ab9f925a843973f
2025-12-16 08:50:00.440 2 ERROR oslo_service.periodic_task
2025-12-16 08:50:00.442 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 08:50:00.442 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 08:50:00.443 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 08:50:00.443 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 08:50:00.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:50:05.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:50:05.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:50:05.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5053 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:50:05.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:50:05.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:50:05.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:50:10.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:50:15.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:50:20.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:50:25.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:50:30.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:50:35.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:50:40.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:50:41.843 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-12-16 08:50:45.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:50:50.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:50:50.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:50:55.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Error during ComputeManager._instance_usage_audit: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 2abc7bd6a64347f9bc40389865ff580e
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task _queue.Empty
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task task(self, context)
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9784, in _instance_usage_audit
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task if objects.TaskLog.get(context, 'instance_usage_audit', begin, end,
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/objects/base.py", line 355, in wrapper
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task return fn.__get__(None, obj)(*args, **kwargs)
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 2abc7bd6a64347f9bc40389865ff580e
2025-12-16 08:51:00.448 2 ERROR oslo_service.periodic_task
2025-12-16 08:51:00.451 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 08:51:00.451 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 08:51:00.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:51:05.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:51:10.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:51:15.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:51:20.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:51:25.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:51:25.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:51:25.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:51:25.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:51:26.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:51:26.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:51:31.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:51:31.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:51:36.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:51:41.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:51:41.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:51:41.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:51:41.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:51:41.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:51:41.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:51:41.851 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-12-16 08:51:46.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:51:51.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:51:56.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Error during ComputeManager._sync_power_states: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID c90f5690220849f784048f6116d2ffbb
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task _queue.Empty
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task task(self, context)
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9894, in _sync_power_states
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task db_instances = objects.InstanceList.get_by_host(context, self.host,
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID c90f5690220849f784048f6116d2ffbb
2025-12-16 08:52:00.460 2 ERROR oslo_service.periodic_task
2025-12-16 08:52:00.461 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 08:52:00.462 2 DEBUG nova.compute.manager [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10147
2025-12-16 08:52:00.462 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 08:52:01.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:52:01.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:52:01.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:52:01.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:52:01.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:52:01.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:52:06.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:52:06.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:52:11.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:52:16.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:52:21.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:52:26.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:52:26.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:52:31.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:52:36.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:52:41.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:52:41.858 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-12-16 08:52:46.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:52:46.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:52:51.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:52:56.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Error during ComputeManager.update_available_resource: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 576517a6669a430cafe1e87b8a785d38
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task _queue.Empty
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task task(self, context)
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10241, in update_available_resource
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task compute_nodes_in_db = self._get_compute_nodes_in_db(context,
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10288, in _get_compute_nodes_in_db
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task return objects.ComputeNodeList.get_all_by_host(context, self.host,
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 576517a6669a430cafe1e87b8a785d38
2025-12-16 08:53:00.481 2 ERROR oslo_service.periodic_task
2025-12-16 08:53:00.483 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 08:53:00.483 2 DEBUG nova.compute.manager [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10796
2025-12-16 08:53:01.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:53:06.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:53:11.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:53:16.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:53:21.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:53:26.081 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:53:31.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:53:36.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:53:36.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:53:41.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:53:41.884 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.02 sec
2025-12-16 08:53:46.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:53:51.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:53:56.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Error during ComputeManager._run_pending_deletes: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID d362e2b141cf40ed8d606b8199094ea3
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task _queue.Empty
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task task(self, context)
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10803, in _run_pending_deletes
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task instances = objects.InstanceList.get_by_filters(
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID d362e2b141cf40ed8d606b8199094ea3
2025-12-16 08:54:00.496 2 ERROR oslo_service.periodic_task
2025-12-16 08:54:00.497 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 08:54:00.498 2 DEBUG nova.compute.manager [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:10834
2025-12-16 08:54:01.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:54:06.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:54:11.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:54:16.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:54:16.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:54:21.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:54:26.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:54:31.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:54:36.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:54:41.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:54:41.889 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec
2025-12-16 08:54:46.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:54:51.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:54:51.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:54:56.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Error during ComputeManager._cleanup_incomplete_migrations: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 9eb2012221954ac598048a784a54924b
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task _queue.Empty
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task task(self, context)
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10837, in _cleanup_incomplete_migrations
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task migrations = objects.MigrationList.get_by_filters(context,
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 9eb2012221954ac598048a784a54924b
2025-12-16 08:55:00.508 2 ERROR oslo_service.periodic_task
2025-12-16 08:55:00.509 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 08:55:01.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:55:06.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:55:11.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:55:11.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:55:16.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:55:21.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:55:21.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:55:21.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5008 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:55:21.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:55:21.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:55:21.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:55:26.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:55:26.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:55:26.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:55:26.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:55:26.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:55:26.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:55:31.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:55:31.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:55:31.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:55:31.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:55:31.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:55:31.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:55:36.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:55:41.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:55:41.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:55:41.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 08:55:41.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:55:41.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:55:41.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 08:55:41.895 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec
2025-12-16 08:55:46.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 08:55:46.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:55:51.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:55:51.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:55:56.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:55:56.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Error during ComputeManager._cleanup_expired_console_auth_tokens: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID e3f25f402b0e4bdfb7769fc1cf030c12
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task _queue.Empty
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task task(self, context)
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10933, in _cleanup_expired_console_auth_tokens
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task objects.ConsoleAuthToken.clean_expired_console_auths(context)
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID e3f25f402b0e4bdfb7769fc1cf030c12
2025-12-16 08:56:00.520 2 ERROR oslo_service.periodic_task
2025-12-16 08:56:01.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:56:01.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:56:06.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:56:06.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:56:11.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:56:11.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:56:16.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:56:16.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:56:21.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:56:21.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:56:26.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:56:26.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:56:31.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:56:31.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:56:36.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:56:36.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:56:41.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:56:41.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:56:41.901 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-12-16 08:56:46.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:56:46.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:56:51.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:56:51.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:56:56.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:56:56.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:57:00.524 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 08:57:00.525 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 08:57:01.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:57:01.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:57:06.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:57:06.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:57:11.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:57:11.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:57:16.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:57:16.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:57:21.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:57:21.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:57:26.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:57:26.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:57:31.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:57:31.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:57:36.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:57:36.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:57:41.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:57:41.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:57:41.910 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-12-16 08:57:46.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:57:46.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:57:51.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:57:51.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:57:56.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:57:56.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Error during ComputeManager._sync_scheduler_instance_info: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID f8e34602fa5848cb9517cfdc9a31257c
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task _queue.Empty
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task task(self, context)
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2153, in _sync_scheduler_instance_info
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task instances = objects.InstanceList.get_by_host(context, self.host,
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID f8e34602fa5848cb9517cfdc9a31257c
2025-12-16 08:58:00.538 2 ERROR oslo_service.periodic_task
2025-12-16 08:58:00.541 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 08:58:00.541 2 DEBUG nova.compute.manager [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9530
2025-12-16 08:58:00.541 2 DEBUG nova.compute.manager [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9534
2025-12-16 08:58:01.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:58:01.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:58:06.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:58:06.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:58:11.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:58:11.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:58:16.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:58:16.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:58:21.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:58:21.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:58:26.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:58:26.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:58:31.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:58:31.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:58:36.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:58:36.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:58:41.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:58:41.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:58:41.918 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-12-16 08:58:46.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:58:46.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:58:51.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:58:51.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:58:56.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:58:56.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Error during ComputeManager._heal_instance_info_cache: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID be2dc285c22747c0b7bf560586a7d832
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task _queue.Empty
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task task(self, context)
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9535, in _heal_instance_info_cache
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task db_instances = objects.InstanceList.get_by_host(
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID be2dc285c22747c0b7bf560586a7d832
2025-12-16 08:59:00.547 2 ERROR oslo_service.periodic_task
2025-12-16 08:59:00.548 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 08:59:00.549 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 08:59:00.549 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 08:59:00.549 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 08:59:01.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:59:01.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:59:06.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:59:06.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:59:11.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:59:11.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:59:16.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:59:16.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:59:21.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:59:21.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:59:26.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:59:26.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:59:31.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:59:31.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:59:36.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:59:36.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:59:41.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:59:41.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:59:41.924 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-12-16 08:59:46.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:59:46.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:59:51.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:59:51.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:59:56.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 08:59:56.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Error during ComputeManager._instance_usage_audit: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID d60fbf0761f24af8a17aba37322bfd73
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task _queue.Empty
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task task(self, context)
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9784, in _instance_usage_audit
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task if objects.TaskLog.get(context, 'instance_usage_audit', begin, end,
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/objects/base.py", line 355, in wrapper
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task return fn.__get__(None, obj)(*args, **kwargs)
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID d60fbf0761f24af8a17aba37322bfd73
2025-12-16 09:00:00.556 2 ERROR oslo_service.periodic_task
2025-12-16 09:00:00.557 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 09:00:00.558 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 09:00:00.558 2 DEBUG nova.compute.manager [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10147
2025-12-16 09:00:00.558 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 09:00:01.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:00:01.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:00:06.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:00:06.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:00:11.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:00:11.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:00:16.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:00:16.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:00:21.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:00:21.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:00:26.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:00:26.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:00:31.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:00:31.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:00:36.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:00:36.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:00:41.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:00:41.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:00:41.931 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-12-16 09:00:46.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:00:46.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:00:51.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:00:51.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:00:56.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:00:56.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Error during ComputeManager.update_available_resource: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 0552e435fc8b472b8de259a51c268edf
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task _queue.Empty
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task task(self, context)
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10241, in update_available_resource
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task compute_nodes_in_db = self._get_compute_nodes_in_db(context,
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10288, in _get_compute_nodes_in_db
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task return objects.ComputeNodeList.get_all_by_host(context, self.host,
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 0552e435fc8b472b8de259a51c268edf
2025-12-16 09:01:00.565 2 ERROR oslo_service.periodic_task
2025-12-16 09:01:00.566 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 09:01:01.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:01:01.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:01:06.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:01:06.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:01:11.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:01:11.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:01:16.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:01:16.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:01:21.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:01:21.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:01:26.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:01:26.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:01:31.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:01:31.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:01:36.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:01:36.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:01:41.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:01:41.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:01:41.937 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-12-16 09:01:46.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:01:46.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:01:51.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:01:51.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:01:56.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:01:56.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Error during ComputeManager._cleanup_running_deleted_instances: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 2e549a2c6a144c92932454078c2b54fa
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task _queue.Empty
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task task(self, context)
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10339, in _cleanup_running_deleted_instances
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task instances = self._running_deleted_instances(context)
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10394, in _running_deleted_instances
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task instances = self._get_instances_on_driver(context, filters)
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 656, in _get_instances_on_driver
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task local_instances = objects.InstanceList.get_by_filters(
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 2e549a2c6a144c92932454078c2b54fa
2025-12-16 09:02:00.576 2 ERROR oslo_service.periodic_task
2025-12-16 09:02:00.578 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 09:02:00.578 2 DEBUG nova.compute.manager [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10796
2025-12-16 09:02:01.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:02:01.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:02:06.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:02:06.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:02:11.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:02:11.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:02:16.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:02:16.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:02:21.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:02:21.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:02:26.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:02:26.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:02:31.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:02:31.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:02:36.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:02:36.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:02:41.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:02:41.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:02:41.944 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-12-16 09:02:46.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:02:46.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:02:51.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:02:51.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:02:56.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:02:56.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Error during ComputeManager._run_pending_deletes: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 4d75f330017d4c7982a08363e8525627
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task _queue.Empty
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task task(self, context)
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10803, in _run_pending_deletes
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task instances = objects.InstanceList.get_by_filters(
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 4d75f330017d4c7982a08363e8525627
2025-12-16 09:03:00.584 2 ERROR oslo_service.periodic_task
2025-12-16 09:03:00.587 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 09:03:00.587 2 DEBUG nova.compute.manager [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:10834
2025-12-16 09:03:01.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:03:01.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:03:06.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:03:06.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:03:11.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:03:11.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:03:16.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:03:16.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:03:21.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:03:21.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:03:26.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:03:26.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:03:31.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:03:31.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:03:36.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:03:36.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:03:41.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:03:41.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:03:41.950 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-12-16 09:03:46.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:03:46.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:03:51.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:03:51.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:03:51.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:03:51.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:03:51.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:03:51.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:03:56.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:03:56.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:03:56.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5006 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:03:56.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:03:56.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:03:56.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Error during ComputeManager._cleanup_incomplete_migrations: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 6fbfe0e78a714a7abc89f1ac685fc1c3
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task _queue.Empty
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task task(self, context)
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10837, in _cleanup_incomplete_migrations
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task migrations = objects.MigrationList.get_by_filters(context,
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 6fbfe0e78a714a7abc89f1ac685fc1c3
2025-12-16 09:04:00.593 2 ERROR oslo_service.periodic_task
2025-12-16 09:04:00.594 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 09:04:01.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:04:01.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:04:01.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:04:01.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:04:01.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:04:01.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:04:06.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:04:06.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:04:06.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5044 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:04:06.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:04:06.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:04:06.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:04:11.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:04:11.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:04:11.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:04:11.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:04:11.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:04:11.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:04:16.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:04:16.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:04:16.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:04:16.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:04:16.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:04:16.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:04:21.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:04:21.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:04:21.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:04:21.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:04:21.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:04:21.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:04:26.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:04:26.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:04:26.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:04:26.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:04:26.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:04:26.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:04:31.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:04:31.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:04:31.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:04:31.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:04:31.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:04:31.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:04:36.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:04:36.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:04:36.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:04:36.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:04:36.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:04:36.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:04:41.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:04:41.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:04:41.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:04:41.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:04:41.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:04:41.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:04:41.958 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-12-16 09:04:46.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:04:46.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:04:46.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:04:46.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:04:46.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:04:46.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:04:51.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:04:51.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:04:51.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:04:51.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:04:52.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:04:52.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:04:57.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:04:57.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:04:57.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:04:57.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:04:57.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:04:57.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Error during ComputeManager._cleanup_expired_console_auth_tokens: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 43dc865f6b8a4e3e83d54244b39a9bb5
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task _queue.Empty
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task task(self, context)
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10933, in _cleanup_expired_console_auth_tokens
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task objects.ConsoleAuthToken.clean_expired_console_auths(context)
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 43dc865f6b8a4e3e83d54244b39a9bb5
2025-12-16 09:05:00.599 2 ERROR oslo_service.periodic_task
2025-12-16 09:05:02.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:05:02.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:05:02.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5033 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:05:02.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:05:02.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:05:02.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:05:07.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:05:07.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:05:07.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:05:07.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:05:07.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:05:07.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:05:12.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:05:12.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:05:12.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:05:12.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:05:12.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:05:12.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:05:17.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:05:17.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:05:17.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:05:17.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:05:17.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:05:17.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:05:22.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:05:22.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:05:22.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:05:22.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:05:22.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:05:22.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:05:22.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:05:27.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:05:27.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:05:27.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:05:27.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:05:27.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:05:27.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:05:31.712 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 09:05:31.712 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 09:05:32.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:05:32.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:05:32.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:05:32.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:05:32.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:05:32.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:05:37.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:05:37.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:05:37.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:05:37.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:05:37.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:05:37.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:05:41.965 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-12-16 09:05:42.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:05:42.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:05:42.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:05:42.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:05:42.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:05:42.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:05:47.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:05:47.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:05:47.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:05:47.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:05:47.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:05:47.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:05:52.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:05:52.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:05:52.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:05:52.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:05:52.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:05:52.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:05:57.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:05:57.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:05:57.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:05:57.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:05:57.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:05:57.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:06:02.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:06:02.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:06:02.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:06:02.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:06:02.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:06:02.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:06:07.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:06:07.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:06:07.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:06:07.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:06:07.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:06:07.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:06:12.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:06:12.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:06:12.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:06:12.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:06:12.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:06:12.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:06:17.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:06:17.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:06:17.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:06:17.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:06:17.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:06:17.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:06:22.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:06:22.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:06:22.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5031 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:06:22.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:06:22.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:06:22.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:06:27.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:06:27.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:06:27.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:06:27.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:06:27.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:06:27.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Error during ComputeManager._sync_scheduler_instance_info: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID c36f5c7a3e8747148592c178e95c39ce
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task _queue.Empty
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task task(self, context)
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2153, in _sync_scheduler_instance_info
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task instances = objects.InstanceList.get_by_host(context, self.host,
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID c36f5c7a3e8747148592c178e95c39ce
2025-12-16 09:06:31.719 2 ERROR oslo_service.periodic_task
2025-12-16 09:06:31.721 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 09:06:31.721 2 DEBUG nova.compute.manager [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9530
2025-12-16 09:06:31.721 2 DEBUG nova.compute.manager [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9534
2025-12-16 09:06:32.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:06:32.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:06:32.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:06:32.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:06:32.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:06:32.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:06:37.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:06:37.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:06:37.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:06:37.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:06:37.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:06:37.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:06:41.971 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-12-16 09:06:42.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:06:42.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:06:42.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:06:42.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:06:42.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:06:42.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:06:47.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:06:47.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:06:52.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:06:57.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:06:57.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:06:57.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5007 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:06:57.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:06:58.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:06:58.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:07:03.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:07:03.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:07:03.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:07:03.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:07:03.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:07:03.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:07:08.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:07:08.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:07:08.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:07:08.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:07:08.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:07:08.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:07:13.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:07:13.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:07:13.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:07:13.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:07:13.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:07:13.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:07:18.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:07:18.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:07:18.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:07:18.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:07:18.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:07:18.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:07:23.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:07:23.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:07:23.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:07:23.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:07:23.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:07:23.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:07:28.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:07:28.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:07:28.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:07:28.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:07:28.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:07:28.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Error during ComputeManager._heal_instance_info_cache: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 3162da4df69f4df891f914100e8818a7
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task _queue.Empty
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task task(self, context)
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9535, in _heal_instance_info_cache
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task db_instances = objects.InstanceList.get_by_host(
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 3162da4df69f4df891f914100e8818a7
2025-12-16 09:07:31.727 2 ERROR oslo_service.periodic_task
2025-12-16 09:07:31.729 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 09:07:31.729 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 09:07:31.729 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 09:07:31.730 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 09:07:33.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:07:38.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:07:38.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:07:38.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5008 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:07:38.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:07:38.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:07:38.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:07:41.977 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-12-16 09:07:43.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:07:43.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:07:43.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:07:43.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:07:43.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:07:43.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:07:48.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:07:53.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:07:53.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:07:53.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5007 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:07:53.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:07:53.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:07:53.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:07:58.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:07:58.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:08:03.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:08:03.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:08:03.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:08:03.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:08:03.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:08:03.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:08:08.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:08:13.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:08:13.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:08:13.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5007 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:08:13.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:08:13.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:08:13.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:08:18.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:08:18.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:08:18.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:08:18.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:08:18.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:08:18.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:08:23.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:08:23.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:08:23.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:08:23.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:08:23.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:08:23.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:08:28.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:08:28.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:08:28.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:08:28.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:08:28.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:08:28.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Error during ComputeManager._instance_usage_audit: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 55aeee34f15345689b5b26066fc8c523
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task _queue.Empty
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task task(self, context)
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9784, in _instance_usage_audit
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task if objects.TaskLog.get(context, 'instance_usage_audit', begin, end,
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/objects/base.py", line 355, in wrapper
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task return fn.__get__(None, obj)(*args, **kwargs)
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 55aeee34f15345689b5b26066fc8c523
2025-12-16 09:08:31.735 2 ERROR oslo_service.periodic_task
2025-12-16 09:08:31.737 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 09:08:31.738 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 09:08:33.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:08:33.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:08:33.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:08:33.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:08:33.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:08:33.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:08:38.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:08:38.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:08:38.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5035 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:08:38.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:08:38.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:08:38.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:08:41.984 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-12-16 09:08:43.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:08:43.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:08:43.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:08:43.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:08:43.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:08:43.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:08:48.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:08:48.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:08:48.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:08:48.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:08:48.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:08:48.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:08:53.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:08:53.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:08:53.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:08:53.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:08:53.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:08:53.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:08:58.472 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561302.internalapi.ooo.test:5672 is unreachable: [Errno 104] Connection reset by peer. Trying again in 1 seconds.: ConnectionResetError: [Errno 104] Connection reset by peer
2025-12-16 09:08:58.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:08:59.200 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 104] Connection reset by peer
2025-12-16 09:08:59.215 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:08:59.233 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:08:59.487 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561302.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:00.503 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561303.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:01.603 2 INFO oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] Reconnected to AMQP server on np0005561301.internalapi.ooo.test:5672 via [amqp] client with port 38192.
2025-12-16 09:09:03.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:09:05.409 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561301.internalapi.ooo.test:5672 is unreachable: [Errno 104] Connection reset by peer. Trying again in 1 seconds.: ConnectionResetError: [Errno 104] Connection reset by peer
2025-12-16 09:09:05.410 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561301.internalapi.ooo.test:5672 is unreachable: [Errno 104] Connection reset by peer. Trying again in 1 seconds.: ConnectionResetError: [Errno 104] Connection reset by peer
2025-12-16 09:09:06.443 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561301.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:06.444 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561301.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:07.474 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561302.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:07.476 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561302.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:08.495 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561303.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 2 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:08.495 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561303.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 2 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:08.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:09:10.515 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561301.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:10.516 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561301.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:11.542 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561302.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:11.543 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561302.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:12.570 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561303.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 4 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:12.571 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561303.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 4 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:13.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:09:14.307 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 104] Connection reset by peer
2025-12-16 09:09:14.322 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:14.337 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:14.352 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 1.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:15.368 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:15.384 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:15.400 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 3.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:16.600 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561301.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:16.601 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561301.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:17.627 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561302.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:17.628 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561302.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:18.419 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:18.434 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:18.447 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 5.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:18.655 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561303.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 6 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:18.656 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561303.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 6 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:18.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:09:23.470 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:23.486 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:23.501 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 7.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:23.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:09:24.687 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561301.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:24.688 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561301.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:25.715 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561302.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:25.716 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561302.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:26.745 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561303.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 8 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:26.746 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561303.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 8 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:28.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:09:30.533 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:30.555 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:30.571 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 9.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Error during ComputeManager._sync_power_states: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 4dfabef800e14c36aef481be44dd19a2
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task _queue.Empty
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task task(self, context)
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9894, in _sync_power_states
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task db_instances = objects.InstanceList.get_by_host(context, self.host,
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 4dfabef800e14c36aef481be44dd19a2
2025-12-16 09:09:31.743 2 ERROR oslo_service.periodic_task
2025-12-16 09:09:31.745 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 09:09:31.745 2 DEBUG nova.compute.manager [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10147
2025-12-16 09:09:31.745 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 09:09:33.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:09:34.779 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561301.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:34.780 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561301.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:35.807 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561302.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:35.809 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561302.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:36.840 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561303.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 10 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:36.842 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561303.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 10 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:38.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:09:39.605 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:39.619 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:39.632 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 11.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:41.991 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-12-16 09:09:42.006 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:42.019 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:42.032 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 1.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:43.046 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:43.059 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:43.072 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 3.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:43.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:09:43.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:09:46.091 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:46.105 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:46.118 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 5.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:46.877 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561301.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:46.878 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561301.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:47.904 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561302.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:47.905 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561302.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:48.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:09:48.931 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561303.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 12 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:48.932 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561303.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 12 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:50.662 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:50.677 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:50.691 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 13.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:51.135 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:51.144 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:51.152 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 7.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:53.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:09:58.173 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:58.186 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:09:58.199 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 9.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
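The "retrying in N seconds" intervals above grow roughly linearly per connection (1.0, 3.0, 5.0, 7.0, 9.0 ... on one, 11.0, 13.0 ... on another), the signature of a retry loop that lengthens its sleep after every failed attempt while the broker keeps returning ECONNREFUSED. A generic sketch of that pattern (plain Python, not the oslo.messaging implementation; the host, port and step values are assumptions):

    import socket
    import time

    def connect_with_backoff(host, port, start=1.0, step=2.0, maximum=30.0):
        """Retry a TCP connect, sleeping a little longer after each failure."""
        delay = start
        while True:
            try:
                return socket.create_connection((host, port), timeout=5)
            except OSError as exc:  # e.g. [Errno 111] ECONNREFUSED while the broker is down
                print(f"Connection failed: {exc} (retrying in {delay} seconds)")
                time.sleep(delay)
                delay = min(delay + step, maximum)

In oslo.messaging the retry interval, backoff step and maximum are configuration options, so the exact progression seen here depends on deployment settings.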
2025-12-16 09:09:58.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:10:00.969 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561301.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:00.969 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561301.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:01.998 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561302.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:01.999 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561302.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:03.026 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561303.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 14 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:03.027 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561303.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 14 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:03.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:10:03.719 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:03.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:10:03.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5015 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:10:03.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:10:03.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:10:03.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:10:03.735 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:03.749 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 15.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:07.223 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:07.237 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:07.251 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 11.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:08.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:10:08.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:10:13.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:10:13.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:10:17.068 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561301.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:17.069 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561301.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:18.097 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561302.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:18.098 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561302.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:18.276 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:18.289 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:18.303 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 13.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:18.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:10:18.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:10:18.777 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:18.791 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:18.805 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 17.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:19.127 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561303.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 16 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:19.128 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561303.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 16 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:23.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:10:23.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:10:28.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:10:28.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:10:31.327 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:31.340 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:31.353 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 15.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:33.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:10:33.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:10:35.171 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561301.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:35.172 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561301.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:35.839 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:35.853 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:35.867 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 19.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:36.199 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561302.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:36.200 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561302.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:37.235 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561303.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 18 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:37.237 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561303.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 18 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:38.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:10:38.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:10:43.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:10:43.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:10:46.381 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:46.394 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:46.408 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 17.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:48.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:10:53.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:10:54.903 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:54.917 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:54.932 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 21.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:55.281 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561301.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:55.282 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561301.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:56.308 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561302.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:56.309 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561302.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:57.337 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561303.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 20 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:57.337 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561303.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 20 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:10:58.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:10:58.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:10:58.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:10:58.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:10:58.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:11:03.438 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:11:03.455 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:11:03.469 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 19.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:11:03.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:11:08.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:11:08.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:11:08.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:11:08.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:11:08.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:11:13.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:11:15.973 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:11:15.988 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:11:16.003 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 23.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:11:17.385 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561301.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:11:17.386 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561301.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:11:18.403 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561302.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:11:18.404 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561302.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:11:18.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:11:19.430 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561303.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 22 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:11:19.431 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561303.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 22 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:11:22.494 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:11:22.508 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:11:22.522 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 21.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:11:23.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:11:28.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:11:28.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:11:28.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:11:28.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:11:28.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:11:28.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:11:33.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:11:38.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:11:39.055 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:11:39.069 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:11:39.083 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 25.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:11:41.486 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561301.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:11:41.490 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561301.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:11:42.514 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561302.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:11:42.515 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561302.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:11:43.543 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] AMQP server on np0005561303.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 24 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:11:43.543 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] AMQP server on np0005561303.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 24 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:11:43.559 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:11:43.572 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:11:43.587 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 23.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-12-16 09:11:43.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:11:43.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:11:48.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:11:48.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:11:48.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:11:48.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:11:48.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:11:48.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:11:53.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:11:58.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:11:58.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:11:58.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5038 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:11:58.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:11:58.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:11:58.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:12:03.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:12:03.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:12:07.691 2 INFO oslo.messaging._drivers.impl_rabbit [-] [eeedce34-bb39-4266-bc4d-6f73ea11510b] Reconnected to AMQP server on np0005561301.internalapi.ooo.test:5672 via [amqp] client with port 58830.
2025-12-16 09:12:07.713 2 INFO oslo.messaging._drivers.impl_rabbit [-] [35046467-2de7-460e-b76a-538d2d270ed1] Reconnected to AMQP server on np0005561301.internalapi.ooo.test:5672 via [amqp] client with port 58840.
2025-12-16 09:12:07.783 2 INFO nova.servicegroup.drivers.db [-] Recovered from being unable to report status.
2025-12-16 09:12:07.783 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 135.79 sec
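The two "run outlasted interval" warnings (50.01 s earlier, 135.79 s here) come from the fixed-interval looping call that drives the servicegroup heartbeat: each run of DbDriver._report_state blocked on the unreachable broker, and when it finally returned, oslo.service noted how far it had overrun its schedule. A minimal sketch of that mechanism, with an assumed 10-second interval (nova takes the real value from its report_interval option):

    from oslo_service import loopingcall

    def report_state():
        # Stand-in for DbDriver._report_state, which pushes a heartbeat row
        # to the services table via the conductor.
        pass

    # FixedIntervalLoopingCall logs a warning when a single invocation takes
    # longer than the interval -- the "run outlasted interval by N sec"
    # messages seen in this log.
    heartbeat = loopingcall.FixedIntervalLoopingCall(report_state)
    heartbeat.start(interval=10)  # assumed interval for illustration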
2025-12-16 09:12:08.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:12:08.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:12:08.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5031 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:12:08.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:12:08.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:12:08.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:12:13.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:12:18.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:12:18.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:12:18.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5034 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:12:18.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:12:18.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:12:18.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:12:23.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:12:28.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:12:28.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:12:28.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:12:28.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:12:28.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:12:28.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:12:29.113 2 ERROR nova.servicegroup.drivers.db [-] Unexpected error while reporting service status: oslo_messaging.rpc.client.RemoteError: Remote error: DBConnectionError (pymysql.err.OperationalError) (2013, 'Lost connection to MySQL server during query')
[SQL: UPDATE services SET updated_at=%(updated_at)s, report_count=%(report_count)s, last_seen_up=%(last_seen_up)s WHERE services.id = %(services_id)s]
[parameters: {'updated_at': datetime.datetime(2025, 12, 16, 9, 12, 27, 807709), 'report_count': 133, 'last_seen_up': datetime.datetime(2025, 12, 16, 9, 12, 27, 807214), 'services_id': 44}]
(Background on this error at: http://sqlalche.me/e/13/e3q8)
['Traceback (most recent call last):\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/engine/base.py", line 1276, in _execute_context\n self.dialect.do_execute(\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/engine/default.py", line 608, in do_execute\n cursor.execute(statement, parameters)\n', ' File "/usr/lib/python3.9/site-packages/pymysql/cursors.py", line 163, in execute\n result = self._query(query)\n', ' File "/usr/lib/python3.9/site-packages/pymysql/cursors.py", line 321, in _query\n conn.query(q)\n', ' File "/usr/lib/python3.9/site-packages/pymysql/connections.py", line 505, in query\n self._affected_rows = self._read_query_result(unbuffered=unbuffered)\n', ' File "/usr/lib/python3.9/site-packages/pymysql/connections.py", line 724, in _read_query_result\n result.read()\n', ' File "/usr/lib/python3.9/site-packages/pymysql/connections.py", line 1069, in read\n first_packet = self.connection._read_packet()\n', ' File "/usr/lib/python3.9/site-packages/pymysql/connections.py", line 646, in _read_packet\n packet_header = self._read_bytes(4)\n', ' File "/usr/lib/python3.9/site-packages/pymysql/connections.py", line 698, in _read_bytes\n raise err.OperationalError(\n', "pymysql.err.OperationalError: (2013, 'Lost connection to MySQL server during query')\n", '\nThe above exception was the direct cause of the following exception:\n\n', 'Traceback (most recent call last):\n', ' File "/usr/lib/python3.9/site-packages/nova/conductor/manager.py", line 140, in _object_dispatch\n return getattr(target, method)(*args, **kwargs)\n', ' File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper\n return fn(self, *args, **kwargs)\n', ' File "/usr/lib/python3.9/site-packages/nova/objects/service.py", line 465, in save\n db_service = db.service_update(self._context, self.id, updates)\n', ' File "/usr/lib/python3.9/site-packages/nova/db/api.py", line 182, in service_update\n return IMPL.service_update(context, service_id, values)\n', ' File "/usr/lib/python3.9/site-packages/oslo_db/api.py", line 154, in wrapper\n ectxt.value = e.inner_exc\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/usr/lib/python3.9/site-packages/oslo_db/api.py", line 142, in wrapper\n return f(*args, **kwargs)\n', ' File "/usr/lib/python3.9/site-packages/nova/db/sqlalchemy/api.py", line 221, in wrapped\n return f(context, *args, **kwargs)\n', ' File "/usr/lib64/python3.9/contextlib.py", line 126, in __exit__\n next(self.gen)\n', ' File "/usr/lib/python3.9/site-packages/oslo_db/sqlalchemy/enginefacade.py", line 1064, in _transaction_scope\n yield resource\n', ' File "/usr/lib64/python3.9/contextlib.py", line 126, in __exit__\n next(self.gen)\n', ' File "/usr/lib/python3.9/site-packages/oslo_db/sqlalchemy/enginefacade.py", line 666, in _session\n self.session.rollback()\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/usr/lib/python3.9/site-packages/oslo_db/sqlalchemy/enginefacade.py", line 663, in _session\n self._end_session_transaction(self.session)\n', ' File "/usr/lib/python3.9/site-packages/oslo_db/sqlalchemy/enginefacade.py", line 691, in _end_session_transaction\n session.commit()\n', ' File 
"/usr/lib64/python3.9/site-packages/sqlalchemy/orm/session.py", line 1046, in commit\n self.transaction.commit()\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/orm/session.py", line 504, in commit\n self._prepare_impl()\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/orm/session.py", line 483, in _prepare_impl\n self.session.flush()\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/orm/session.py", line 2540, in flush\n self._flush(objects)\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/orm/session.py", line 2682, in _flush\n transaction.rollback(_capture_exception=True)\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/util/langhelpers.py", line 68, in __exit__\n compat.raise_(\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/util/compat.py", line 182, in raise_\n raise exception\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/orm/session.py", line 2642, in _flush\n flush_context.execute()\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/orm/unitofwork.py", line 422, in execute\n rec.execute(self)\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/orm/unitofwork.py", line 586, in execute\n persistence.save_obj(\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/orm/persistence.py", line 230, in save_obj\n _emit_update_statements(\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/orm/persistence.py", line 994, in _emit_update_statements\n c = cached_connections[connection].execute(\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/engine/base.py", line 1011, in execute\n return meth(self, multiparams, params)\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/sql/elements.py", line 298, in _execute_on_connection\n return connection._execute_clauseelement(self, multiparams, params)\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/engine/base.py", line 1124, in _execute_clauseelement\n ret = self._execute_context(\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/engine/base.py", line 1316, in _execute_context\n self._handle_dbapi_exception(\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/engine/base.py", line 1508, in _handle_dbapi_exception\n util.raise_(newraise, with_traceback=exc_info[2], from_=e)\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/util/compat.py", line 182, in raise_\n raise exception\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/engine/base.py", line 1276, in _execute_context\n self.dialect.do_execute(\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/engine/default.py", line 608, in do_execute\n cursor.execute(statement, parameters)\n', ' File "/usr/lib/python3.9/site-packages/pymysql/cursors.py", line 163, in execute\n result = self._query(query)\n', ' File "/usr/lib/python3.9/site-packages/pymysql/cursors.py", line 321, in _query\n conn.query(q)\n', ' File "/usr/lib/python3.9/site-packages/pymysql/connections.py", line 505, in query\n self._affected_rows = self._read_query_result(unbuffered=unbuffered)\n', ' File "/usr/lib/python3.9/site-packages/pymysql/connections.py", line 724, in _read_query_result\n result.read()\n', ' File "/usr/lib/python3.9/site-packages/pymysql/connections.py", line 1069, in read\n first_packet = self.connection._read_packet()\n', ' File "/usr/lib/python3.9/site-packages/pymysql/connections.py", line 646, in _read_packet\n packet_header = self._read_bytes(4)\n', ' File "/usr/lib/python3.9/site-packages/pymysql/connections.py", line 698, in _read_bytes\n raise 
err.OperationalError(\n', "oslo_db.exception.DBConnectionError: (pymysql.err.OperationalError) (2013, 'Lost connection to MySQL server during query')\n[SQL: UPDATE services SET updated_at=%(updated_at)s, report_count=%(report_count)s, last_seen_up=%(last_seen_up)s WHERE services.id = %(services_id)s]\n[parameters: {'updated_at': datetime.datetime(2025, 12, 16, 9, 12, 27, 807709), 'report_count': 133, 'last_seen_up': datetime.datetime(2025, 12, 16, 9, 12, 27, 807214), 'services_id': 44}]\n(Background on this error at: http://sqlalche.me/e/13/e3q8)\n"].
2025-12-16 09:12:29.113 2 ERROR nova.servicegroup.drivers.db Traceback (most recent call last):
2025-12-16 09:12:29.113 2 ERROR nova.servicegroup.drivers.db File "/usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py", line 92, in _report_state
2025-12-16 09:12:29.113 2 ERROR nova.servicegroup.drivers.db service.service_ref.save()
2025-12-16 09:12:29.113 2 ERROR nova.servicegroup.drivers.db File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 209, in wrapper
2025-12-16 09:12:29.113 2 ERROR nova.servicegroup.drivers.db updates, result = self.indirection_api.object_action(
2025-12-16 09:12:29.113 2 ERROR nova.servicegroup.drivers.db File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 247, in object_action
2025-12-16 09:12:29.113 2 ERROR nova.servicegroup.drivers.db return cctxt.call(context, 'object_action', objinst=objinst,
2025-12-16 09:12:29.113 2 ERROR nova.servicegroup.drivers.db File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-12-16 09:12:29.113 2 ERROR nova.servicegroup.drivers.db self.transport._send(self.target, msg_ctxt, msg,
2025-12-16 09:12:29.113 2 ERROR nova.servicegroup.drivers.db File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-12-16 09:12:29.113 2 ERROR nova.servicegroup.drivers.db return self._driver.send(target, ctxt, message,
2025-12-16 09:12:29.113 2 ERROR nova.servicegroup.drivers.db File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-12-16 09:12:29.113 2 ERROR nova.servicegroup.drivers.db return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-12-16 09:12:29.113 2 ERROR nova.servicegroup.drivers.db File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 673, in _send
2025-12-16 09:12:29.113 2 ERROR nova.servicegroup.drivers.db raise result
2025-12-16 09:12:29.113 2 ERROR nova.servicegroup.drivers.db oslo_messaging.rpc.client.RemoteError: Remote error: DBConnectionError (pymysql.err.OperationalError) (2013, 'Lost connection to MySQL server during query')
2025-12-16 09:12:29.113 2 ERROR nova.servicegroup.drivers.db [SQL: UPDATE services SET updated_at=%(updated_at)s, report_count=%(report_count)s, last_seen_up=%(last_seen_up)s WHERE services.id = %(services_id)s]
2025-12-16 09:12:29.113 2 ERROR nova.servicegroup.drivers.db [parameters: {'updated_at': datetime.datetime(2025, 12, 16, 9, 12, 27, 807709), 'report_count': 133, 'last_seen_up': datetime.datetime(2025, 12, 16, 9, 12, 27, 807214), 'services_id': 44}]
2025-12-16 09:12:29.113 2 ERROR nova.servicegroup.drivers.db (Background on this error at: http://sqlalche.me/e/13/e3q8)
2025-12-16 09:12:29.113 2 ERROR nova.servicegroup.drivers.db ['Traceback (most recent call last):\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/engine/base.py", line 1276, in _execute_context\n self.dialect.do_execute(\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/engine/default.py", line 608, in do_execute\n cursor.execute(statement, parameters)\n', ' File "/usr/lib/python3.9/site-packages/pymysql/cursors.py", line 163, in execute\n result = self._query(query)\n', ' File "/usr/lib/python3.9/site-packages/pymysql/cursors.py", line 321, in _query\n conn.query(q)\n', ' File "/usr/lib/python3.9/site-packages/pymysql/connections.py", line 505, in query\n self._affected_rows = self._read_query_result(unbuffered=unbuffered)\n', ' File "/usr/lib/python3.9/site-packages/pymysql/connections.py", line 724, in _read_query_result\n result.read()\n', ' File "/usr/lib/python3.9/site-packages/pymysql/connections.py", line 1069, in read\n first_packet = self.connection._read_packet()\n', ' File "/usr/lib/python3.9/site-packages/pymysql/connections.py", line 646, in _read_packet\n packet_header = self._read_bytes(4)\n', ' File "/usr/lib/python3.9/site-packages/pymysql/connections.py", line 698, in _read_bytes\n raise err.OperationalError(\n', "pymysql.err.OperationalError: (2013, 'Lost connection to MySQL server during query')\n", '\nThe above exception was the direct cause of the following exception:\n\n', 'Traceback (most recent call last):\n', ' File "/usr/lib/python3.9/site-packages/nova/conductor/manager.py", line 140, in _object_dispatch\n return getattr(target, method)(*args, **kwargs)\n', ' File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper\n return fn(self, *args, **kwargs)\n', ' File "/usr/lib/python3.9/site-packages/nova/objects/service.py", line 465, in save\n db_service = db.service_update(self._context, self.id, updates)\n', ' File "/usr/lib/python3.9/site-packages/nova/db/api.py", line 182, in service_update\n return IMPL.service_update(context, service_id, values)\n', ' File "/usr/lib/python3.9/site-packages/oslo_db/api.py", line 154, in wrapper\n ectxt.value = e.inner_exc\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/usr/lib/python3.9/site-packages/oslo_db/api.py", line 142, in wrapper\n return f(*args, **kwargs)\n', ' File "/usr/lib/python3.9/site-packages/nova/db/sqlalchemy/api.py", line 221, in wrapped\n return f(context, *args, **kwargs)\n', ' File "/usr/lib64/python3.9/contextlib.py", line 126, in __exit__\n next(self.gen)\n', ' File "/usr/lib/python3.9/site-packages/oslo_db/sqlalchemy/enginefacade.py", line 1064, in _transaction_scope\n yield resource\n', ' File "/usr/lib64/python3.9/contextlib.py", line 126, in __exit__\n next(self.gen)\n', ' File "/usr/lib/python3.9/site-packages/oslo_db/sqlalchemy/enginefacade.py", line 666, in _session\n self.session.rollback()\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/usr/lib/python3.9/site-packages/oslo_db/sqlalchemy/enginefacade.py", line 663, in _session\n self._end_session_transaction(self.session)\n', ' File "/usr/lib/python3.9/site-packages/oslo_db/sqlalchemy/enginefacade.py", line 
691, in _end_session_transaction\n session.commit()\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/orm/session.py", line 1046, in commit\n self.transaction.commit()\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/orm/session.py", line 504, in commit\n self._prepare_impl()\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/orm/session.py", line 483, in _prepare_impl\n self.session.flush()\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/orm/session.py", line 2540, in flush\n self._flush(objects)\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/orm/session.py", line 2682, in _flush\n transaction.rollback(_capture_exception=True)\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/util/langhelpers.py", line 68, in __exit__\n compat.raise_(\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/util/compat.py", line 182, in raise_\n raise exception\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/orm/session.py", line 2642, in _flush\n flush_context.execute()\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/orm/unitofwork.py", line 422, in execute\n rec.execute(self)\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/orm/unitofwork.py", line 586, in execute\n persistence.save_obj(\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/orm/persistence.py", line 230, in save_obj\n _emit_update_statements(\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/orm/persistence.py", line 994, in _emit_update_statements\n c = cached_connections[connection].execute(\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/engine/base.py", line 1011, in execute\n return meth(self, multiparams, params)\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/sql/elements.py", line 298, in _execute_on_connection\n return connection._execute_clauseelement(self, multiparams, params)\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/engine/base.py", line 1124, in _execute_clauseelement\n ret = self._execute_context(\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/engine/base.py", line 1316, in _execute_context\n self._handle_dbapi_exception(\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/engine/base.py", line 1508, in _handle_dbapi_exception\n util.raise_(newraise, with_traceback=exc_info[2], from_=e)\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/util/compat.py", line 182, in raise_\n raise exception\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/engine/base.py", line 1276, in _execute_context\n self.dialect.do_execute(\n', ' File "/usr/lib64/python3.9/site-packages/sqlalchemy/engine/default.py", line 608, in do_execute\n cursor.execute(statement, parameters)\n', ' File "/usr/lib/python3.9/site-packages/pymysql/cursors.py", line 163, in execute\n result = self._query(query)\n', ' File "/usr/lib/python3.9/site-packages/pymysql/cursors.py", line 321, in _query\n conn.query(q)\n', ' File "/usr/lib/python3.9/site-packages/pymysql/connections.py", line 505, in query\n self._affected_rows = self._read_query_result(unbuffered=unbuffered)\n', ' File "/usr/lib/python3.9/site-packages/pymysql/connections.py", line 724, in _read_query_result\n result.read()\n', ' File "/usr/lib/python3.9/site-packages/pymysql/connections.py", line 1069, in read\n first_packet = self.connection._read_packet()\n', ' File "/usr/lib/python3.9/site-packages/pymysql/connections.py", line 646, in _read_packet\n packet_header = self._read_bytes(4)\n', ' File 
"/usr/lib/python3.9/site-packages/pymysql/connections.py", line 698, in _read_bytes\n raise err.OperationalError(\n', "oslo_db.exception.DBConnectionError: (pymysql.err.OperationalError) (2013, 'Lost connection to MySQL server during query')\n[SQL: UPDATE services SET updated_at=%(updated_at)s, report_count=%(report_count)s, last_seen_up=%(last_seen_up)s WHERE services.id = %(services_id)s]\n[parameters: {'updated_at': datetime.datetime(2025, 12, 16, 9, 12, 27, 807709), 'report_count': 133, 'last_seen_up': datetime.datetime(2025, 12, 16, 9, 12, 27, 807214), 'services_id': 44}]\n(Background on this error at: http://sqlalche.me/e/13/e3q8)\n"].
2025-12-16 09:12:29.113 2 ERROR nova.servicegroup.drivers.db
2025-12-16 09:12:33.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:12:33.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:12:37.815 2 INFO nova.servicegroup.drivers.db [-] Recovered from being unable to report status.
2025-12-16 09:12:38.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:12:38.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:12:38.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:12:38.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:12:39.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:12:39.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:12:44.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:12:49.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:12:49.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:12:49.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:12:49.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:12:49.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:12:49.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:12:54.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:12:54.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:12:54.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:12:54.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:12:54.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:12:54.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:12:59.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:12:59.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:12:59.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:12:59.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:12:59.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:12:59.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:13:04.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Error during ComputeManager.update_available_resource: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID bf9e7526fba349baa04c68be3c6ea46d
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task _queue.Empty
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task task(self, context)
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10241, in update_available_resource
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task compute_nodes_in_db = self._get_compute_nodes_in_db(context,
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10288, in _get_compute_nodes_in_db
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task return objects.ComputeNodeList.get_all_by_host(context, self.host,
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID bf9e7526fba349baa04c68be3c6ea46d
2025-12-16 09:13:04.176 2 ERROR oslo_service.periodic_task
2025-12-16 09:13:04.178 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 09:13:04.179 2 DEBUG oslo_concurrency.lockutils [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355
2025-12-16 09:13:04.179 2 DEBUG oslo_concurrency.lockutils [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Lock "storage-registry-lock" released by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367
2025-12-16 09:13:04.180 2 DEBUG oslo_concurrency.lockutils [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355
2025-12-16 09:13:04.180 2 DEBUG oslo_concurrency.lockutils [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Lock "storage-registry-lock" released by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367
2025-12-16 09:13:04.212 2 DEBUG nova.virt.libvirt.imagecache [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Adding ephemeral_1_0706d66 into backend ephemeral images _store_ephemeral_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:100
2025-12-16 09:13:04.233 2 DEBUG nova.virt.libvirt.imagecache [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
2025-12-16 09:13:04.233 2 DEBUG nova.virt.libvirt.imagecache [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Image id bfbd3b30-a6ec-4347-8cc5-cbaa017c7d94 yields fingerprint 18f2db0af0cae4fdb800c69b648454efeaee9749 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319
2025-12-16 09:13:04.234 2 INFO nova.virt.libvirt.imagecache [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] image bfbd3b30-a6ec-4347-8cc5-cbaa017c7d94 at (/var/lib/nova/instances/_base/18f2db0af0cae4fdb800c69b648454efeaee9749): checking
2025-12-16 09:13:04.234 2 DEBUG nova.virt.libvirt.imagecache [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] image bfbd3b30-a6ec-4347-8cc5-cbaa017c7d94 at (/var/lib/nova/instances/_base/18f2db0af0cae4fdb800c69b648454efeaee9749): image is in use _mark_in_use /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:279
2025-12-16 09:13:04.237 2 DEBUG nova.virt.libvirt.imagecache [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Image id yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319
2025-12-16 09:13:04.238 2 DEBUG nova.virt.libvirt.imagecache [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] 1d40a622-5897-42e0-a00b-d387818a170d is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126
2025-12-16 09:13:04.238 2 DEBUG nova.virt.libvirt.imagecache [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] 1d40a622-5897-42e0-a00b-d387818a170d has a disk file _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:129
2025-12-16 09:13:04.238 2 DEBUG oslo_concurrency.processutils [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1d40a622-5897-42e0-a00b-d387818a170d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
2025-12-16 09:13:04.312 2 DEBUG oslo_concurrency.processutils [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1d40a622-5897-42e0-a00b-d387818a170d/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
2025-12-16 09:13:04.313 2 DEBUG nova.virt.libvirt.imagecache [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Instance 1d40a622-5897-42e0-a00b-d387818a170d is backed by 18f2db0af0cae4fdb800c69b648454efeaee9749 _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:141
2025-12-16 09:13:04.313 2 INFO nova.virt.libvirt.imagecache [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Active base files: /var/lib/nova/instances/_base/18f2db0af0cae4fdb800c69b648454efeaee9749
2025-12-16 09:13:04.314 2 DEBUG nova.virt.libvirt.imagecache [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
2025-12-16 09:13:04.314 2 DEBUG nova.virt.libvirt.imagecache [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
2025-12-16 09:13:04.314 2 DEBUG nova.virt.libvirt.imagecache [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
2025-12-16 09:13:04.316 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 09:13:04.316 2 DEBUG nova.compute.manager [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10796
2025-12-16 09:13:04.340 2 DEBUG nova.compute.manager [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10805
2025-12-16 09:13:04.340 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 09:13:04.341 2 DEBUG nova.compute.manager [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:10834
2025-12-16 09:13:04.362 2 DEBUG oslo_service.periodic_task [req-844a8d49-eef6-44d0-a7d3-26ee264cec17 - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-12-16 09:13:09.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:13:09.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-12-16 09:13:09.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-12-16 09:13:09.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-12-16 09:13:09.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-12-16 09:13:09.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519