\nThe Keystone service is temporarily unavailable.\n\n", "code": "503 Service Unavailable", "title": "Service Unavailable"} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:258
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] An error occurred while refreshing the network cache.: neutronclient.common.exceptions.ServiceUnavailable: The server is currently unavailable. Please try again at a later time.
The Keystone service is temporarily unavailable.
Neutron server returns request_ids: ['req-abe54c4b-6288-4d06-99be-a58c675cd8c0']
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] Traceback (most recent call last):
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9577, in _heal_instance_info_cache
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] self._require_nw_info_update(context, instance):
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9484, in _require_nw_info_update
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] ports = self.network_api.list_ports(context, **search_opts)
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1806, in list_ports
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] return get_client(context).list_ports(**search_opts)
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] ret = obj(*args, **kwargs)
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 799, in list_ports
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] return self.list('ports', self.ports_path, retrieve_all,
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] ret = obj(*args, **kwargs)
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 368, in list
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] for r in self._pagination(collection, path, **params):
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 383, in _pagination
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] res = self.get(path, params=params)
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] ret = obj(*args, **kwargs)
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 352, in get
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] return self.retry_request("GET", action, body=body,
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] ret = obj(*args, **kwargs)
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 329, in retry_request
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] return self.do_request(method, action, body=body,
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] ret = obj(*args, **kwargs)
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 293, in do_request
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] self._handle_fault_response(status_code, replybody, resp)
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] ret = obj(*args, **kwargs)
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 268, in _handle_fault_response
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] exception_handler_v20(status_code, error_body)
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] raise client_exc(message=error_message,
2025-10-14 08:34:58.909 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] neutronclient.common.exceptions.ServiceUnavailable: The server is currently unavailable. Please try again at a later time.
\nThe Keystone service is temporarily unavailable.\n\n", "code": "503 Service Unavailable", "title": "Service Unavailable"} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:258
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] An error occurred while refreshing the network cache.: neutronclient.common.exceptions.ServiceUnavailable: The server is currently unavailable. Please try again at a later time.
The Keystone service is temporarily unavailable.
Neutron server returns request_ids: ['req-c6d8df11-8a6a-488f-b6a1-f64c942fb071']
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] Traceback (most recent call last):
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9577, in _heal_instance_info_cache
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] self._require_nw_info_update(context, instance):
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9484, in _require_nw_info_update
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] ports = self.network_api.list_ports(context, **search_opts)
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1806, in list_ports
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] return get_client(context).list_ports(**search_opts)
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] ret = obj(*args, **kwargs)
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 799, in list_ports
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] return self.list('ports', self.ports_path, retrieve_all,
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] ret = obj(*args, **kwargs)
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 368, in list
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] for r in self._pagination(collection, path, **params):
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 383, in _pagination
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] res = self.get(path, params=params)
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] ret = obj(*args, **kwargs)
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 352, in get
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] return self.retry_request("GET", action, body=body,
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] ret = obj(*args, **kwargs)
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 329, in retry_request
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] return self.do_request(method, action, body=body,
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] ret = obj(*args, **kwargs)
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 293, in do_request
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] self._handle_fault_response(status_code, replybody, resp)
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] ret = obj(*args, **kwargs)
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 268, in _handle_fault_response
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] exception_handler_v20(status_code, error_body)
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] raise client_exc(message=error_message,
2025-10-14 08:36:00.925 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] neutronclient.common.exceptions.ServiceUnavailable: The server is currently unavailable. Please try again at a later time.
\nThe Keystone service is temporarily unavailable.\n\n", "code": "503 Service Unavailable", "title": "Service Unavailable"}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider a184a243-ffdb-4617-a020-38479983bf8d: {"message": "The server is currently unavailable. Please try again at a later time.
\nThe Keystone service is temporarily unavailable.\n\n", "code": "503 Service Unavailable", "title": "Service Unavailable"}
2025-10-14 08:36:56.487 2 DEBUG nova.compute.resource_tracker [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056
2025-10-14 08:36:56.487 2 DEBUG nova.compute.resource_tracker [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Final resource view: name=np0005486752.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065
2025-10-14 08:36:56.571 2 ERROR nova.scheduler.client.report [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] [req-810b7712-e03e-46af-ba77-699cb004ffff] Failed to retrieve resource provider tree from placement API for UUID a184a243-ffdb-4617-a020-38479983bf8d. Got 503: {"message": "The server is currently unavailable. Please try again at a later time.
The proxy server received an invalid
2025-10-14 08:37:10.158 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] response from an upstream server.
2025-10-14 08:37:10.158 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] The proxy server could not handle the request
\nThe Keystone service is temporarily unavailable.\n\n", "code": "503 Service Unavailable", "title": "Service Unavailable"}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider a184a243-ffdb-4617-a020-38479983bf8d: {"message": "The server is currently unavailable. Please try again at a later time.
\nThe Keystone service is temporarily unavailable.\n\n", "code": "503 Service Unavailable", "title": "Service Unavailable"}
2025-10-14 08:37:56.524 2 DEBUG nova.compute.resource_tracker [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056
2025-10-14 08:37:56.525 2 DEBUG nova.compute.resource_tracker [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Final resource view: name=np0005486752.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065
2025-10-14 08:37:56.604 2 ERROR nova.scheduler.client.report [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] [req-f89cbef5-ae5d-47ef-94f9-e9be1b9dcb53] Failed to retrieve resource provider tree from placement API for UUID a184a243-ffdb-4617-a020-38479983bf8d. Got 503: {"message": "The server is currently unavailable. Please try again at a later time.
No server is available to handle this request.
_handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:258
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] An error occurred while refreshing the network cache.: neutronclient.common.exceptions.ServiceUnavailable:
503 Service Unavailable
No server is available to handle this request.
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] Traceback (most recent call last):
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9577, in _heal_instance_info_cache
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] self._require_nw_info_update(context, instance):
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9484, in _require_nw_info_update
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] ports = self.network_api.list_ports(context, **search_opts)
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1806, in list_ports
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] return get_client(context).list_ports(**search_opts)
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] ret = obj(*args, **kwargs)
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 799, in list_ports
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] return self.list('ports', self.ports_path, retrieve_all,
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] ret = obj(*args, **kwargs)
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 368, in list
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] for r in self._pagination(collection, path, **params):
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 383, in _pagination
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] res = self.get(path, params=params)
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] ret = obj(*args, **kwargs)
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 352, in get
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] return self.retry_request("GET", action, body=body,
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] ret = obj(*args, **kwargs)
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 329, in retry_request
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] return self.do_request(method, action, body=body,
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] ret = obj(*args, **kwargs)
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 293, in do_request
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] self._handle_fault_response(status_code, replybody, resp)
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] ret = obj(*args, **kwargs)
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 268, in _handle_fault_response
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] exception_handler_v20(status_code, error_body)
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] raise client_exc(message=error_message,
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] neutronclient.common.exceptions.ServiceUnavailable:
503 Service Unavailable
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83] No server is available to handle this request.
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83]
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83]
2025-10-14 08:38:03.885 2 ERROR nova.compute.manager [instance: 556cc523-273b-4aa5-a2de-446f1aaace83]
2025-10-14 08:38:04.746 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:38:05.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:38:06.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:38:10.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:38:11.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:38:15.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:38:16.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:38:20.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:38:21.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:38:25.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:38:26.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:38:30.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:38:31.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:38:35.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:38:36.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:38:40.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:38:41.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:38:45.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:38:46.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:38:50.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:38:51.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:38:55.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:38:56.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:38:57.783 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:38:57.784 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:38:57.784 2 DEBUG nova.compute.manager [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131
2025-10-14 08:38:57.784 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:39:00.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:39:01.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:39:05.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:39:06.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:39:10.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:39:12.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:39:15.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:39:17.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:39:20.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:39:22.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:39:25.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:39:27.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:39:30.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:39:32.056 2 WARNING nova.servicegroup.drivers.db [-] Lost connection to nova-conductor for reporting service status.: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 3140f20e26314affb88ebfc3451e99ac
2025-10-14 08:39:32.058 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-10-14 08:39:32.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:39:35.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:39:37.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:39:40.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:39:42.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:39:45.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:39:47.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:39:50.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:39:52.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:39:55.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:39:57.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Error during ComputeManager.update_available_resource: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID ebc552f690874a3f98e3aff04a331324
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10225, in update_available_resource
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task compute_nodes_in_db = self._get_compute_nodes_in_db(context,
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10272, in _get_compute_nodes_in_db
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task return objects.ComputeNodeList.get_all_by_host(context, self.host,
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID ebc552f690874a3f98e3aff04a331324
2025-10-14 08:39:57.793 2 ERROR oslo_service.periodic_task
2025-10-14 08:39:58.799 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:39:58.799 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:39:58.800 2 DEBUG nova.compute.manager [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514
2025-10-14 08:39:58.800 2 DEBUG nova.compute.manager [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518
2025-10-14 08:40:00.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:40:02.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:40:05.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:40:07.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:40:10.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:40:12.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:40:15.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:40:17.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:40:20.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:40:22.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:40:25.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:40:27.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:40:30.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:40:32.064 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-10-14 08:40:32.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:40:35.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:40:37.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:40:40.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:40:42.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:40:45.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:40:47.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:40:50.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:40:52.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:40:55.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:40:57.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Error during ComputeManager._heal_instance_info_cache: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID d69e358d15704bc59fa2cdd587d370bc
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9519, in _heal_instance_info_cache
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task db_instances = objects.InstanceList.get_by_host(
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID d69e358d15704bc59fa2cdd587d370bc
2025-10-14 08:40:58.806 2 ERROR oslo_service.periodic_task
2025-10-14 08:40:58.807 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:40:58.807 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:40:58.808 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:40:58.808 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:41:00.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:41:02.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:41:05.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:41:07.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:41:10.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:41:12.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:41:15.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:41:17.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:41:20.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:41:22.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:41:25.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:41:27.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:41:30.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:41:32.072 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-10-14 08:41:32.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:41:35.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:41:37.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:41:40.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:41:43.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:41:45.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:41:48.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:41:50.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:41:53.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:41:55.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:41:58.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Error during ComputeManager._instance_usage_audit: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 4e52841a8c954b6197094a5888f0b47a
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9768, in _instance_usage_audit
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task if objects.TaskLog.get(context, 'instance_usage_audit', begin, end,
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/objects/base.py", line 355, in wrapper
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task return fn.__get__(None, obj)(*args, **kwargs)
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 4e52841a8c954b6197094a5888f0b47a
2025-10-14 08:41:58.814 2 ERROR oslo_service.periodic_task
2025-10-14 08:41:58.816 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:41:58.816 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:41:58.817 2 DEBUG nova.compute.manager [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131
2025-10-14 08:41:58.817 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:42:00.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:42:03.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:42:05.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:42:08.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:42:10.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:42:13.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:42:15.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:42:18.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:42:20.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:42:23.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:42:26.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:42:28.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:42:31.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:42:32.078 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-10-14 08:42:33.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:42:36.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:42:38.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:42:41.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:42:43.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:42:46.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:42:48.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:42:51.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:42:53.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:42:56.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:42:58.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Error during ComputeManager.update_available_resource: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID b1922f3a04f74328a78b94f9c00e51d5
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10225, in update_available_resource
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task compute_nodes_in_db = self._get_compute_nodes_in_db(context,
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10272, in _get_compute_nodes_in_db
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task return objects.ComputeNodeList.get_all_by_host(context, self.host,
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID b1922f3a04f74328a78b94f9c00e51d5
2025-10-14 08:42:58.824 2 ERROR oslo_service.periodic_task
2025-10-14 08:42:58.826 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:42:58.826 2 DEBUG nova.compute.manager [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10780
2025-10-14 08:43:01.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:43:03.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:43:06.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:43:08.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:43:11.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:43:13.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:43:16.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:43:18.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:43:21.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:43:23.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:43:26.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:43:28.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:43:31.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:43:32.085 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-10-14 08:43:33.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:43:36.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:43:38.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:43:41.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:43:43.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:43:46.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:43:48.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:43:51.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:43:53.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:43:56.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:43:58.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Error during ComputeManager._run_pending_deletes: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 3c23650aaa4a4c7786f1eacc2dad5fe0
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10787, in _run_pending_deletes
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task instances = objects.InstanceList.get_by_filters(
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 3c23650aaa4a4c7786f1eacc2dad5fe0
2025-10-14 08:43:58.833 2 ERROR oslo_service.periodic_task
2025-10-14 08:43:58.835 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:43:58.835 2 DEBUG nova.compute.manager [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:10818
2025-10-14 08:44:01.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:44:03.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:44:06.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:44:08.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:44:11.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:44:13.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:44:16.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:44:18.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:44:21.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:44:23.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:44:26.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:44:28.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:44:32.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:44:32.092 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-10-14 08:44:33.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:44:37.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:44:38.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:44:42.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:44:43.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:44:47.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:44:48.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:44:52.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:44:53.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:44:57.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:44:58.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Error during ComputeManager._cleanup_incomplete_migrations: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID fa86fb4742f24321a24ad9b6a48a8095
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10821, in _cleanup_incomplete_migrations
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task migrations = objects.MigrationList.get_by_filters(context,
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID fa86fb4742f24321a24ad9b6a48a8095
2025-10-14 08:44:58.842 2 ERROR oslo_service.periodic_task
2025-10-14 08:44:58.844 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:45:02.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:45:03.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:45:07.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:45:08.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:45:12.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:45:13.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:45:17.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:45:18.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:45:22.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:45:23.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:45:27.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:45:28.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:45:32.097 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec
2025-10-14 08:45:32.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:45:33.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:45:37.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:45:38.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:45:42.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:45:43.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:45:47.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:45:48.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:45:52.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:45:53.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:45:57.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:45:58.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Error during ComputeManager._cleanup_expired_console_auth_tokens: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 10c2cf2cec8d4a97b6882c4a6afbaa62
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10917, in _cleanup_expired_console_auth_tokens
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task objects.ConsoleAuthToken.clean_expired_console_auths(context)
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 10c2cf2cec8d4a97b6882c4a6afbaa62
2025-10-14 08:45:58.850 2 ERROR oslo_service.periodic_task
2025-10-14 08:46:02.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:46:03.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:46:07.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:46:08.698 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:46:08.699 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:46:08.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:46:12.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:46:13.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:46:17.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:46:18.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:46:22.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:46:24.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:46:27.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:46:29.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:46:32.104 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-10-14 08:46:32.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:46:34.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:46:37.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:46:39.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:46:42.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:46:44.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:46:47.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:46:49.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:46:52.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:46:54.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:46:57.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:46:59.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:47:02.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:47:04.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:47:08.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Error during ComputeManager._sync_scheduler_instance_info: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 5bb0569b753a428bad84d5c492c55b1e
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2137, in _sync_scheduler_instance_info
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task instances = objects.InstanceList.get_by_host(context, self.host,
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 5bb0569b753a428bad84d5c492c55b1e
2025-10-14 08:47:08.705 2 ERROR oslo_service.periodic_task
2025-10-14 08:47:08.707 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:47:08.708 2 DEBUG nova.compute.manager [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514
2025-10-14 08:47:08.708 2 DEBUG nova.compute.manager [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518
2025-10-14 08:47:09.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:47:13.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:47:14.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:47:18.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:47:19.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:47:23.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:47:24.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:47:28.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:47:29.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:47:32.111 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-10-14 08:47:33.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:47:34.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:47:38.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:47:39.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:47:43.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:47:44.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:47:48.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:47:49.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:47:53.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:47:54.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:47:58.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:47:59.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:48:03.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:48:04.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:48:08.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Error during ComputeManager._heal_instance_info_cache: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID ad0b3c0461f54bc0b99fcb68a9fb1521
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9519, in _heal_instance_info_cache
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task db_instances = objects.InstanceList.get_by_host(
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID ad0b3c0461f54bc0b99fcb68a9fb1521
2025-10-14 08:48:08.715 2 ERROR oslo_service.periodic_task
2025-10-14 08:48:08.716 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:48:08.717 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:48:08.717 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:48:08.717 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:48:09.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:48:13.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:48:14.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:48:18.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:48:19.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:48:23.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:48:24.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:48:28.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:48:29.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:48:32.119 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-10-14 08:48:33.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:48:34.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:48:38.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:48:39.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:48:43.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:48:44.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:48:48.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:48:50.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:48:53.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:48:55.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:48:58.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:49:00.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:49:03.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:49:05.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:49:08.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Error during ComputeManager._instance_usage_audit: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID e28de28fcfc9477b9ffe0e25714f788c
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9768, in _instance_usage_audit
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task if objects.TaskLog.get(context, 'instance_usage_audit', begin, end,
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/objects/base.py", line 355, in wrapper
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task return fn.__get__(None, obj)(*args, **kwargs)
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID e28de28fcfc9477b9ffe0e25714f788c
2025-10-14 08:49:08.724 2 ERROR oslo_service.periodic_task
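Each of the MessagingTimeout tracebacks above ends the same way: the AMQP driver blocks on a per-message reply queue and, when no reply arrives before the RPC timeout, the underlying queue.Empty is surfaced as oslo_messaging.exceptions.MessagingTimeout ("During handling of the above exception..."). The snippet below is a minimal, hypothetical sketch of that pattern, standard library only; ReplyWaiter and its methods are invented for illustration and are not the oslo.messaging implementation.

# Hypothetical sketch (not oslo.messaging itself): how a reply waiter turns
# queue.Empty into a MessagingTimeout like the ones logged above.
import queue
import uuid


class MessagingTimeout(Exception):
    """Stand-in for oslo_messaging.exceptions.MessagingTimeout."""


class ReplyWaiter:
    def __init__(self):
        self._queues = {}  # msg_id -> queue of replies

    def expect_reply(self, msg_id):
        self._queues[msg_id] = queue.Queue()

    def put_reply(self, msg_id, reply):
        self._queues[msg_id].put(reply)

    def get(self, msg_id, timeout):
        try:
            # Block until a reply arrives or the timeout expires.
            return self._queues[msg_id].get(block=True, timeout=timeout)
        except queue.Empty:
            # No reply within the RPC timeout window: report it as a
            # messaging timeout, as in the traceback above.
            raise MessagingTimeout(
                'Timed out waiting for a reply to message ID %s' % msg_id)


if __name__ == '__main__':
    waiter = ReplyWaiter()
    msg_id = uuid.uuid4().hex
    waiter.expect_reply(msg_id)
    try:
        waiter.get(msg_id, timeout=1)  # nothing ever answers
    except MessagingTimeout as exc:
        print(exc)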
2025-10-14 08:49:08.725 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:49:08.726 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:49:10.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:49:13.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:49:15.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:49:18.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:49:20.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:49:23.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:49:25.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:49:28.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:49:30.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:49:32.127 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-10-14 08:49:33.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:49:35.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:49:38.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:49:40.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:49:43.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:49:45.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:49:48.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:49:50.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:49:53.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:49:55.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:49:58.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:50:00.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:50:03.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:50:05.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:50:08.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Error during ComputeManager._sync_power_states: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID f40dbf26dbc54ca68d9f39731a27783f
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9878, in _sync_power_states
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task db_instances = objects.InstanceList.get_by_host(context, self.host,
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID f40dbf26dbc54ca68d9f39731a27783f
2025-10-14 08:50:08.732 2 ERROR oslo_service.periodic_task
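Note that each failure is logged as "Error during ComputeManager.<task>" and the service then moves straight on to the next "Running periodic task ..." line, so one timed-out RPC never stops the rest of the periodic work. A rough sketch of that per-task error isolation is below; the run_periodic_tasks helper here is hypothetical and only mirrors the behaviour visible in the log, not the oslo_service code itself.

# Hypothetical sketch of the pattern seen above: each periodic task runs in
# its own try/except, so a failure is logged and the loop simply continues.
import logging

LOG = logging.getLogger(__name__)


def run_periodic_tasks(tasks, context):
    for name, task in tasks:
        LOG.debug('Running periodic task %s', name)
        try:
            task(context)
        except Exception:
            # Mirrors the "Error during ComputeManager._foo" entries above:
            # log the traceback, then carry on with the next task.
            LOG.exception('Error during %s', name)


if __name__ == '__main__':
    logging.basicConfig(level=logging.DEBUG)

    def ok(ctx):
        pass

    def broken(ctx):
        raise RuntimeError('simulated RPC timeout')

    run_periodic_tasks([('ComputeManager.ok', ok),
                        ('ComputeManager.broken', broken)], context=None)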
2025-10-14 08:50:08.734 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:50:08.735 2 DEBUG nova.compute.manager [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131
2025-10-14 08:50:08.735 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:50:10.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:50:13.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:50:15.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:50:18.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:50:20.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:50:23.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:50:25.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:50:28.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:50:30.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:50:32.134 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-10-14 08:50:33.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:50:35.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:50:38.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:50:40.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:50:43.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:50:45.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:50:48.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:50:50.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:50:53.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:50:55.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:50:58.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:51:00.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:51:03.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:51:05.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:51:08.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Error during ComputeManager.update_available_resource: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID f0949fbcf85049e7b9cb5ef601024bee
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10225, in update_available_resource
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task compute_nodes_in_db = self._get_compute_nodes_in_db(context,
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10272, in _get_compute_nodes_in_db
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task return objects.ComputeNodeList.get_all_by_host(context, self.host,
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID f0949fbcf85049e7b9cb5ef601024bee
2025-10-14 08:51:08.743 2 ERROR oslo_service.periodic_task
2025-10-14 08:51:08.744 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:51:08.745 2 DEBUG nova.compute.manager [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10780
2025-10-14 08:51:10.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:51:13.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:51:15.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:51:18.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:51:20.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:51:23.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:51:25.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:51:28.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:51:30.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:51:32.140 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-10-14 08:51:33.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:51:35.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:51:38.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:51:40.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:51:43.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:51:45.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:51:48.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:51:50.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:51:53.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:51:55.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:51:58.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:52:00.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:52:03.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:52:05.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Error during ComputeManager._run_pending_deletes: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 60b996ae52ae442a8d1673548b551708
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10787, in _run_pending_deletes
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task instances = objects.InstanceList.get_by_filters(
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 60b996ae52ae442a8d1673548b551708
2025-10-14 08:52:08.751 2 ERROR oslo_service.periodic_task
2025-10-14 08:52:08.753 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:52:08.753 2 DEBUG nova.compute.manager [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:10818
2025-10-14 08:52:08.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:52:10.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:52:14.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:52:15.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:52:19.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:52:20.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:52:24.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:52:25.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:52:29.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:52:30.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:52:32.147 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-10-14 08:52:34.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:52:35.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:52:39.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:52:40.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:52:44.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:52:45.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:52:49.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:52:50.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:52:54.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:52:55.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:52:59.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:53:00.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:53:04.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:53:05.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Error during ComputeManager._cleanup_incomplete_migrations: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 62d7cd474dce4b8ea2792c875a5a6529
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10821, in _cleanup_incomplete_migrations
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task migrations = objects.MigrationList.get_by_filters(context,
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 62d7cd474dce4b8ea2792c875a5a6529
2025-10-14 08:53:08.761 2 ERROR oslo_service.periodic_task
2025-10-14 08:53:08.762 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:53:09.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:53:10.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:53:14.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:53:15.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:53:19.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:53:20.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:53:24.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:53:25.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:53:29.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:53:30.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:53:32.154 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
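The recurring "run outlasted interval by 50.01 sec" warnings indicate that the _report_state call itself is blocking for far longer than its report interval. The following is a rough, hypothetical sketch of how a fixed-interval looping call produces such a warning; fixed_interval_loop is an invented helper using only the standard library, not the oslo.service loopingcall code.

# Hypothetical sketch: a fixed-interval loop warns when the callee takes
# longer than the configured interval, as in the warnings above.
import logging
import time

LOG = logging.getLogger(__name__)


def fixed_interval_loop(func, interval, rounds):
    for _ in range(rounds):
        start = time.monotonic()
        func()
        elapsed = time.monotonic() - start
        delay = interval - elapsed
        if delay <= 0:
            LOG.warning("Function %r run outlasted interval by %.2f sec",
                        func.__name__, -delay)
            delay = 0
        time.sleep(delay)


if __name__ == '__main__':
    logging.basicConfig(level=logging.WARNING)

    def _report_state():
        time.sleep(2)  # simulate a DB/RPC call that blocks past the interval

    fixed_interval_loop(_report_state, interval=1, rounds=1)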
2025-10-14 08:53:34.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:53:36.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:53:39.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:53:41.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:53:44.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:53:46.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:53:49.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:53:51.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:53:54.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:53:56.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:53:59.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:54:01.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:54:04.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:54:06.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Error during ComputeManager._cleanup_expired_console_auth_tokens: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 29604e63f4664769b541d0e4d57fe971
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10917, in _cleanup_expired_console_auth_tokens
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task objects.ConsoleAuthToken.clean_expired_console_auths(context)
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 29604e63f4664769b541d0e4d57fe971
2025-10-14 08:54:08.768 2 ERROR oslo_service.periodic_task
2025-10-14 08:54:09.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:54:11.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:54:14.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:54:16.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:54:20.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:54:21.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:54:25.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:54:26.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:54:30.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:54:31.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:54:32.161 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-10-14 08:54:35.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:54:36.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:54:40.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:54:41.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:54:45.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:54:46.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:54:50.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:54:51.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:54:55.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:54:56.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:55:00.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:55:01.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:55:05.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:55:05.968 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:55:05.968 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:55:06.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:55:10.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:55:11.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:55:15.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:55:16.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:55:20.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:55:21.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:55:25.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:55:26.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:55:30.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:55:31.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:55:32.167 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
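
The warning above is emitted by oslo.service's FixedIntervalLoopingCall: the service-group heartbeat (DbDriver._report_state) is scheduled every report_interval seconds, but each run here blocks on the same stalled RPC path, so the callback overruns its interval and the looping call logs by how much. A minimal sketch of how that warning is produced, assuming an illustrative 10-second interval and a callback that blocks for about a minute (neither value is taken from this deployment's configuration):

# Illustrative sketch, not nova code: FixedIntervalLoopingCall warns with
# "run outlasted interval by N sec" whenever its callback takes longer than
# the interval it was scheduled with.
import time
from oslo_service import loopingcall

def report_state():
    # Stand-in for DbDriver._report_state; pretend the RPC call blocks ~60 s.
    time.sleep(60)

timer = loopingcall.FixedIntervalLoopingCall(report_state)
# A 10 s interval with a ~60 s callback overruns each cycle by roughly 50 s,
# which matches the repeated warnings in this log.
timer.start(interval=10)
timer.wait()  # blocks; the loop keeps running and warning every cycle
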
2025-10-14 08:55:35.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:55:36.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:55:40.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:55:41.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:55:45.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:55:46.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:55:50.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:55:52.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:55:55.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:55:57.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:56:00.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:56:02.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:56:05.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Error during ComputeManager._sync_scheduler_instance_info: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 196d380686ae4734a9efb5262e5f5901
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2137, in _sync_scheduler_instance_info
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task instances = objects.InstanceList.get_by_host(context, self.host,
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 196d380686ae4734a9efb5262e5f5901
2025-10-14 08:56:05.975 2 ERROR oslo_service.periodic_task
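
The traceback above is nova-compute calling nova-conductor over RPC (objects.InstanceList.get_by_host is routed through the conductor's object_class_action_versions) and never receiving a reply before oslo.messaging's response timeout expires, at which point the AMQP driver's reply waiter raises MessagingTimeout. A minimal sketch of that synchronous call pattern, assuming a placeholder transport URL, topic and method name rather than this deployment's actual values:

# Hedged sketch of a blocking oslo.messaging RPC call; the broker URL, topic,
# and method are placeholders, not values from this deployment.
import oslo_messaging
from oslo_config import cfg

cfg.CONF([])  # initialize an empty config so transport options use defaults
transport = oslo_messaging.get_rpc_transport(
    cfg.CONF, url='rabbit://guest:guest@rabbit.example.test:5672/')
target = oslo_messaging.Target(topic='conductor')
client = oslo_messaging.RPCClient(transport, target)

try:
    # call() publishes the request and blocks on a reply queue; if no reply
    # arrives within the timeout, MessagingTimeout is raised, the same
    # exception the periodic tasks in this log keep hitting.
    client.prepare(timeout=60).call({}, 'example_method', arg='value')
except oslo_messaging.MessagingTimeout:
    pass  # nova's periodic task wrapper logs the traceback seen above
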
2025-10-14 08:56:05.976 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:56:05.976 2 DEBUG nova.compute.manager [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514
2025-10-14 08:56:05.977 2 DEBUG nova.compute.manager [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518
2025-10-14 08:56:07.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:56:10.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:56:12.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:56:15.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:56:17.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:56:20.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:56:22.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:56:25.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:56:27.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:56:30.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:56:32.174 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-10-14 08:56:32.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:56:35.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:56:37.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:56:40.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:56:42.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:56:45.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:56:47.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:56:50.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:56:52.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:56:55.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:56:57.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:57:00.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:57:02.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:57:05.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Error during ComputeManager._heal_instance_info_cache: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 33bde3fda2c44e0a8057c4d93125e070
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9519, in _heal_instance_info_cache
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task db_instances = objects.InstanceList.get_by_host(
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 33bde3fda2c44e0a8057c4d93125e070
2025-10-14 08:57:05.982 2 ERROR oslo_service.periodic_task
2025-10-14 08:57:05.984 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:57:05.984 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:57:05.984 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:57:05.985 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:57:07.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:57:10.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:57:12.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:57:15.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:57:17.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:57:20.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:57:22.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:57:25.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:57:27.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:57:30.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:57:32.181 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-10-14 08:57:32.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:57:35.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:57:37.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:57:40.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:57:42.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:57:45.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:57:47.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:57:50.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:57:52.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:57:55.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:57:58.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:58:00.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:58:03.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:58:05.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Error during ComputeManager._instance_usage_audit: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID eccaabe6ea054b03aca35f345bdb4e96
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9768, in _instance_usage_audit
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task if objects.TaskLog.get(context, 'instance_usage_audit', begin, end,
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/objects/base.py", line 355, in wrapper
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task return fn.__get__(None, obj)(*args, **kwargs)
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID eccaabe6ea054b03aca35f345bdb4e96
2025-10-14 08:58:05.989 2 ERROR oslo_service.periodic_task
2025-10-14 08:58:05.992 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:58:05.992 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:58:08.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:58:10.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:58:13.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:58:15.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:58:18.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:58:20.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:58:23.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:58:25.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:58:28.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:58:30.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:58:32.188 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-10-14 08:58:33.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:58:35.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:58:38.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:58:40.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:58:43.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:58:45.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:58:48.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:58:50.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:58:53.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:58:55.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:58:58.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:59:00.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:59:03.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Error during ComputeManager._sync_power_states: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 85da4335af1f4dbc89b3fa77eba552be
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9878, in _sync_power_states
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task db_instances = objects.InstanceList.get_by_host(context, self.host,
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 85da4335af1f4dbc89b3fa77eba552be
2025-10-14 08:59:06.001 2 ERROR oslo_service.periodic_task
2025-10-14 08:59:06.003 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:59:06.003 2 DEBUG nova.compute.manager [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131
2025-10-14 08:59:06.003 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 08:59:06.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:59:08.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:59:11.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:59:13.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:59:16.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:59:18.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:59:21.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:59:23.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:59:26.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:59:28.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:59:31.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:59:32.196 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-10-14 08:59:33.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:59:35.514 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486749.internalapi.ooo.test:5672 is unreachable: [Errno 104] Connection reset by peer. Trying again in 1 seconds.: ConnectionResetError: [Errno 104] Connection reset by peer
2025-10-14 08:59:35.719 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 104] Connection reset by peer
2025-10-14 08:59:35.731 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:36.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:59:36.531 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486749.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:37.637 2 INFO oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] Reconnected to AMQP server on np0005486748.internalapi.ooo.test:5672 via [amqp] client with port 57108.
2025-10-14 08:59:38.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:59:41.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:59:41.421 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486748.internalapi.ooo.test:5672 is unreachable: [Errno 104] Connection reset by peer. Trying again in 1 seconds.: ConnectionResetError: [Errno 104] Connection reset by peer
2025-10-14 08:59:42.432 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486748.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:43.500 2 INFO oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] Reconnected to AMQP server on np0005486746.internalapi.ooo.test:5672 via [amqp] client with port 53592.
2025-10-14 08:59:43.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:59:44.494 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 104] Connection reset by peer
2025-10-14 08:59:44.514 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:44.535 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:46.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:59:48.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:59:49.792 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486746.internalapi.ooo.test:5672 is unreachable: (0, 0): (320) CONNECTION_FORCED - broker forced connection closure with reason 'shutdown'. Trying again in 1 seconds.: amqp.exceptions.ConnectionForced: (0, 0): (320) CONNECTION_FORCED - broker forced connection closure with reason 'shutdown'
2025-10-14 08:59:49.793 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486746.internalapi.ooo.test:5672 is unreachable: (0, 0): (320) CONNECTION_FORCED - broker forced connection closure with reason 'shutdown'. Trying again in 1 seconds.: amqp.exceptions.ConnectionForced: (0, 0): (320) CONNECTION_FORCED - broker forced connection closure with reason 'shutdown'
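
From this point the rabbit driver is cycling through the three brokers named in the log (np0005486746/48/49) with an increasing back-off ("Trying again in 1/2/4/6/8 seconds"), reconnecting to whichever node still accepts connections. ECONNREFUSED means the TCP port is closed on that node, while the CONNECTION_FORCED 'shutdown' error means a broker closed an already established connection as it went down. A hedged diagnostic sketch for checking raw reachability of each broker from the compute host; it only tests whether the AMQP port accepts TCP connections, not authentication or cluster health:

# Diagnostic sketch: probe the RabbitMQ nodes seen in this log on the standard
# AMQP port (5672). Connection refusals here correspond to the repeated
# "[Errno 111] ECONNREFUSED" entries above.
import socket

BROKERS = [
    'np0005486746.internalapi.ooo.test',
    'np0005486748.internalapi.ooo.test',
    'np0005486749.internalapi.ooo.test',
]

for host in BROKERS:
    try:
        with socket.create_connection((host, 5672), timeout=5):
            print(host, 'accepting connections on 5672')
    except OSError as exc:
        print(host, 'unreachable:', exc)
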
2025-10-14 08:59:50.813 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486746.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:50.814 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 104] Connection reset by peer
2025-10-14 08:59:50.823 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486746.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:50.823 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:50.835 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:50.846 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 1.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:51.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:59:51.835 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486748.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:51.836 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486749.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:51.857 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:51.867 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:51.876 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 3.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:52.867 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486749.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 2 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:52.868 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486748.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 2 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:53.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:59:54.901 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486746.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:54.902 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486746.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:54.912 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:54.923 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:54.934 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 5.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:55.931 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486748.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:55.932 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486749.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:56.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:59:56.960 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486749.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 4 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:56.961 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486748.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 4 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:58.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 08:59:59.624 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 104] Connection reset by peer
2025-10-14 08:59:59.633 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:59.641 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:59.648 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 1.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:59.972 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:59.979 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 08:59:59.987 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 7.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:00.661 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:00.671 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:00.679 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 3.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:00.997 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486746.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:00.998 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486746.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:01.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:00:02.033 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486748.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:02.034 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486749.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:03.065 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486749.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 6 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:03.066 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486748.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 6 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:03.699 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:03.713 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:03.728 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 5.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:03.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Error during ComputeManager.update_available_resource: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID e806842bb4364384b6b32b6bcdb0675f
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10225, in update_available_resource
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task compute_nodes_in_db = self._get_compute_nodes_in_db(context,
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10272, in _get_compute_nodes_in_db
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task return objects.ComputeNodeList.get_all_by_host(context, self.host,
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID e806842bb4364384b6b32b6bcdb0675f
2025-10-14 09:00:06.012 2 ERROR oslo_service.periodic_task
2025-10-14 09:00:06.013 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 09:00:06.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:00:07.034 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:07.049 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:07.063 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 9.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:08.749 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:08.763 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:08.777 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 7.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:08.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:00:09.089 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486746.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:09.090 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486746.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:10.107 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486748.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:10.108 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486749.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:11.129 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486749.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 8 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:11.129 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486748.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 8 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:11.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:00:13.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:00:15.801 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:15.817 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:15.831 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 9.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:16.087 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:16.103 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:16.119 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 11.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:16.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:00:18.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:00:19.149 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486746.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:19.150 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486746.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:20.183 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486748.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:20.184 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486749.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:21.205 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486749.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 10 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:21.207 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486748.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 10 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:21.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:00:24.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:00:24.858 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:24.872 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:24.887 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 11.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:26.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:00:27.147 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:27.162 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:27.177 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 13.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:29.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:00:31.233 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486746.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:31.234 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486746.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:31.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:00:32.203 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec
2025-10-14 09:00:32.262 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486748.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:32.263 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486749.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:33.290 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486749.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 12 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:33.291 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486748.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 12 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:34.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:00:35.915 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:35.930 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:35.946 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 13.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:36.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:00:39.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:00:40.200 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:40.209 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:40.222 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 15.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:41.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:00:44.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:00:45.329 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486746.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:45.330 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486746.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:46.348 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486748.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:46.349 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486749.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:46.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:00:47.378 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486749.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 14 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:47.379 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486748.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 14 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:48.978 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:48.993 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:49.009 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 15.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:49.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:00:51.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:00:54.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:00:55.262 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:55.279 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:55.295 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 17.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:00:56.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:00:59.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:01:01.435 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486746.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:01.436 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486746.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:01.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:01:02.466 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486748.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:02.467 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486749.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:03.489 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486749.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 16 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:03.490 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486748.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 16 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:04.060 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:04.079 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:04.095 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 17.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:04.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:01:06.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:01:09.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:01:12.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:01:12.340 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:12.356 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:12.373 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 19.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:14.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:01:17.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:01:19.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:01:19.522 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486746.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:19.523 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486746.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:20.551 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486748.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:20.553 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486749.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:21.135 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:21.146 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:21.156 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 19.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:21.567 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486749.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 18 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:21.567 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486748.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 18 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:22.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:01:24.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:01:27.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:01:29.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:01:31.412 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:31.426 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:31.441 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 21.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:32.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:01:34.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:01:37.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:01:39.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:01:39.614 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486746.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:39.615 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486746.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:40.196 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:40.211 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:40.227 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 21.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:40.643 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486748.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:40.644 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486749.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:41.674 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486749.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 20 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:41.675 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486748.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 20 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:42.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:01:44.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:01:47.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:01:49.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:01:52.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:01:52.480 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:52.495 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:52.513 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 23.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:01:54.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:01:57.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:01:59.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:02:01.320 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:02:01.336 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:02:01.351 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 23.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:02:01.727 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486746.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:02:01.728 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486746.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:02:02.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:02:02.762 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486748.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:02:02.763 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486749.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:02:03.783 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486749.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 22 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:02:03.784 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486748.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 22 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:02:04.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:02:07.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:02:09.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:02:12.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:02:14.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:02:15.572 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:02:15.587 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:02:15.601 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 25.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:02:17.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:02:19.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:02:22.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:02:24.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:02:24.402 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:02:24.409 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:02:24.416 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 25.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:02:25.840 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486746.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:02:25.842 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486746.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:02:26.871 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486748.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:02:26.873 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486749.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 1 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:02:27.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:02:27.886 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] AMQP server on np0005486749.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 24 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:02:27.887 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] AMQP server on np0005486748.internalapi.ooo.test:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 24 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:02:29.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:02:32.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:02:34.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:02:37.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:02:39.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:02:40.646 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-14 09:02:42.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:02:44.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:02:47.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:02:49.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:02:52.022 2 INFO oslo.messaging._drivers.impl_rabbit [-] [f8c1d6c3-262c-4d26-887b-215068dda01a] Reconnected to AMQP server on np0005486746.internalapi.ooo.test:5672 via [amqp] client with port 51228.
2025-10-14 09:02:52.025 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 09:02:52.027 2 DEBUG oslo_concurrency.lockutils [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355
2025-10-14 09:02:52.027 2 DEBUG oslo_concurrency.lockutils [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Lock "storage-registry-lock" released by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367
2025-10-14 09:02:52.028 2 DEBUG oslo_concurrency.lockutils [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355
2025-10-14 09:02:52.028 2 DEBUG oslo_concurrency.lockutils [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Lock "storage-registry-lock" released by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367
2025-10-14 09:02:52.049 2 DEBUG nova.virt.libvirt.imagecache [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Adding ephemeral_1_0706d66 into backend ephemeral images _store_ephemeral_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:100
2025-10-14 09:02:52.053 2 INFO oslo.messaging._drivers.impl_rabbit [-] [1e80a8de-1af7-4e28-b6a6-b6d8be34363b] Reconnected to AMQP server on np0005486746.internalapi.ooo.test:5672 via [amqp] client with port 51238.
2025-10-14 09:02:52.074 2 INFO nova.servicegroup.drivers.db [-] Recovered from being unable to report status.
2025-10-14 09:02:52.074 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 129.87 sec
2025-10-14 09:02:52.689 2 DEBUG nova.virt.libvirt.imagecache [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
2025-10-14 09:02:52.689 2 DEBUG nova.virt.libvirt.imagecache [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Image id e79d68ed-cc3d-451d-9065-86232671e699 yields fingerprint b0b7b09af12e285a0c0f8a5869b5b75155587d41 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319
2025-10-14 09:02:52.690 2 INFO nova.virt.libvirt.imagecache [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] image e79d68ed-cc3d-451d-9065-86232671e699 at (/var/lib/nova/instances/_base/b0b7b09af12e285a0c0f8a5869b5b75155587d41): checking
2025-10-14 09:02:52.690 2 DEBUG nova.virt.libvirt.imagecache [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] image e79d68ed-cc3d-451d-9065-86232671e699 at (/var/lib/nova/instances/_base/b0b7b09af12e285a0c0f8a5869b5b75155587d41): image is in use _mark_in_use /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:279
2025-10-14 09:02:52.692 2 DEBUG nova.virt.libvirt.imagecache [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Image id yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319
2025-10-14 09:02:52.692 2 DEBUG nova.virt.libvirt.imagecache [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] 556cc523-273b-4aa5-a2de-446f1aaace83 is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126
2025-10-14 09:02:52.693 2 DEBUG nova.virt.libvirt.imagecache [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] 556cc523-273b-4aa5-a2de-446f1aaace83 has a disk file _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:129
2025-10-14 09:02:52.693 2 DEBUG oslo_concurrency.processutils [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/556cc523-273b-4aa5-a2de-446f1aaace83/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
2025-10-14 09:02:52.776 2 DEBUG oslo_concurrency.processutils [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/556cc523-273b-4aa5-a2de-446f1aaace83/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
2025-10-14 09:02:52.777 2 DEBUG nova.virt.libvirt.imagecache [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Instance 556cc523-273b-4aa5-a2de-446f1aaace83 is backed by b0b7b09af12e285a0c0f8a5869b5b75155587d41 _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:141
2025-10-14 09:02:52.778 2 INFO nova.virt.libvirt.imagecache [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Active base files: /var/lib/nova/instances/_base/b0b7b09af12e285a0c0f8a5869b5b75155587d41
2025-10-14 09:02:52.778 2 DEBUG nova.virt.libvirt.imagecache [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
2025-10-14 09:02:52.778 2 DEBUG nova.virt.libvirt.imagecache [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
2025-10-14 09:02:52.779 2 DEBUG nova.virt.libvirt.imagecache [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
2025-10-14 09:02:52.780 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 09:02:52.781 2 DEBUG nova.compute.manager [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10780
2025-10-14 09:02:52.805 2 DEBUG nova.compute.manager [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10789
2025-10-14 09:02:52.806 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 09:02:52.806 2 DEBUG nova.compute.manager [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:10818
2025-10-14 09:02:52.822 2 DEBUG oslo_service.periodic_task [req-d1fc3d45-22fa-4545-8b1c-bfe78629ecc8 - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-14 09:02:52.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:02:54.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:02:57.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:02:59.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:03:02.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:03:04.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:03:07.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:03:09.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:03:13.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:03:14.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:03:18.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:03:19.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:03:23.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:03:24.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:03:28.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:03:29.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:03:33.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:03:34.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:03:38.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-14 09:03:39.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263