2025-10-09 14:34:41.550 2 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
2025-10-09 14:34:41.550 2 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
2025-10-09 14:34:41.551 2 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
2025-10-09 14:34:41.551 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
2025-10-09 14:34:41.613 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
2025-10-09 14:34:41.636 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
2025-10-09 14:34:42.120 2 INFO nova.virt.driver [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
2025-10-09 14:34:42.284 2 INFO nova.compute.provider_config [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] No provider configs found in /etc/nova/provider_config. If files are present, ensure the Nova process has access.
2025-10-09 14:34:42.298 2 DEBUG oslo_concurrency.lockutils [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:266
2025-10-09 14:34:42.299 2 DEBUG oslo_concurrency.lockutils [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:282
2025-10-09 14:34:42.299 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:363
2025-10-09 14:34:42.299 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2593
2025-10-09 14:34:42.299 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
2025-10-09 14:34:42.299 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2595
2025-10-09 14:34:42.300 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] config files: ['/usr/share/nova/nova-dist.conf', '/etc/nova/nova.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2596
2025-10-09 14:34:42.300 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2598
2025-10-09 14:34:42.300 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.300 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.300 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] backdoor_port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.300 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.301 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.301 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.301 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.301 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.301 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.301 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] config_dir = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.302 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.302 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] config_file = ['/usr/share/nova/nova-dist.conf', '/etc/nova/nova.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.302 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.302 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] console_host = standalone.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.302 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.302 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cpu_allocation_ratio = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.302 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.303 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.303 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.303 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.303 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.303 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.303 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.304 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] disk_allocation_ratio = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.304 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.304 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.304 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.304 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.304 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] force_config_drive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.305 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.305 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.305 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.305 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] host = standalone.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.305 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] initial_cpu_allocation_ratio = 16.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.305 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] initial_disk_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.306 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] initial_ram_allocation_ratio = 1.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.306 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] injected_network_template = /usr/share/nova/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.306 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.306 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.306 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.306 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.307 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] instance_usage_audit = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.307 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] instance_usage_audit_period = hour log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.307 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.307 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.307 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.307 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.308 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.308 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.308 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.308 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] log_dir = /var/log/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.308 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.308 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.308 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.309 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.309 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.309 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.309 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.309 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.309 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.309 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] logging_user_identity_format = %(user)s %(tenant)s %(domain)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.310 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.310 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.310 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.310 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.310 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.310 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.310 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.311 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.311 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.311 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] metadata_listen_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.311 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.311 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.311 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] mkisofs_cmd = mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.311 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] my_block_storage_ip = 172.17.0.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.312 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] my_ip = 172.17.0.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.312 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.312 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.312 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.312 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] osapi_compute_listen_port = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.312 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.313 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] osapi_compute_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.313 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.313 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.313 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.313 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.313 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.313 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.313 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.314 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.314 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.314 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.314 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.314 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.314 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.314 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.315 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.315 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.315 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.315 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.315 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.315 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.316 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.316 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.316 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.316 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.316 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.316 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.317 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.317 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.317 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.317 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.317 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.317 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.318 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.318 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.318 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.318 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.318 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.318 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.318 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.318 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.319 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.319 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.319 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.319 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.319 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.319 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.319 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.320 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.320 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.320 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.320 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.320 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.320 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.320 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.321 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.321 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.321 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.321 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.321 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606
2025-10-09 14:34:42.322 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.322 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.322 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.322 2 WARNING oslo_config.cfg [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] Deprecated: Option "dhcp_domain" from group "DEFAULT" is deprecated. Use option "dhcp_domain" from group "api".
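[Editor's note] The WARNING entry above is oslo.config's deprecation handling: nova defines dhcp_domain in the [api] group, and a value still set under [DEFAULT] in nova.conf is accepted but logged as deprecated. A minimal sketch of that mechanism, using a hypothetical standalone option definition rather than nova's real one:

    # Sketch only: shows how oslo.config maps a legacy [DEFAULT] value onto a
    # group option and emits the "Deprecated: Option ..." warning seen above.
    from oslo_config import cfg

    CONF = cfg.CONF
    api_group = cfg.OptGroup('api')
    CONF.register_group(api_group)
    CONF.register_opts([
        cfg.StrOpt('dhcp_domain',
                   default='novalocal',
                   deprecated_group='DEFAULT',  # [DEFAULT]/dhcp_domain still resolves here
                   help='DHCP domain used for instance hostnames (illustrative).'),
    ], group=api_group)

    # A value read from [DEFAULT]/dhcp_domain is resolved to CONF.api.dhcp_domain,
    # and oslo.config logs the deprecation warning while doing so.
    CONF([], project='nova')
    print(CONF.api.dhcp_domain)

The api.dhcp_domain value logged just below is empty, which suggests this deployment explicitly sets it to an empty string rather than relying on the upstream default.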
2025-10-09 14:34:42.323 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.323 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api.enable_instance_password = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.323 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.323 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.323 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.323 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.324 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.324 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.324 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.324 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.324 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.324 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.324 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.325 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.325 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.325 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.325 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.325 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.325 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.326 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.326 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.backend = dogpile.cache.memcached log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.326 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.326 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.326 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.327 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.327 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.enable_retry_client = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.327 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.327 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.327 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.328 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.328 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.328 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.328 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.328 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.328 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.328 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.329 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.memcache_servers = ['standalone.internalapi.localdomain:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.329 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.329 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.329 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.329 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.329 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.330 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.330 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.330 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.330 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.330 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.330 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.330 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.331 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.331 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cinder.auth_type = v3password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.331 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.331 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.331 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.331 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.332 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.332 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.332 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.332 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.332 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.332 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cinder.os_region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.332 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.333 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.333 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.333 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.333 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.333 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.333 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.334 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.334 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.334 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] compute.packing_host_numa_cells_allocation_strategy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.334 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] compute.provider_config_location = /etc/nova/provider_config log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.334 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.334 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.334 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.335 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.335 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.335 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.335 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.335 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.336 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.336 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.336 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.336 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.336 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.336 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.337 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.337 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.337 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.337 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.337 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.338 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.338 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.338 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.338 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.338 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.338 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.338 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.339 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.339 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.339 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.339 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api_database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.339 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.339 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.339 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api_database.max_overflow = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.340 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api_database.max_pool_size = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.340 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.340 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.340 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.340 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api_database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.340 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.341 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.341 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] devices.enabled_vgpu_types = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.341 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.341 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.341 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.341 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.api_servers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.341 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.342 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.342 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.342 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.342 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.342 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.342 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.342 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.343 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.343 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.343 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.343 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.343 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.343 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.343 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.344 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.344 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.344 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.344 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.344 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.344 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.344 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.345 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.345 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.345 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.345 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.346 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.346 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.346 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] glance.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.346 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.346 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.346 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.347 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
2025-10-09 14:34:42.347 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.347 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] hyperv.enable_remotefx = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.347 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] hyperv.instances_path_share = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.347 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.348 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.348 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.348 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.348 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.348 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.348 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.349 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.349 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.349 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.349 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.349 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.349 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] mks.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.350 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.350 2 DEBUG oslo_service.service 
[req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.350 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.350 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.350 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.351 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.351 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.351 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.351 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.351 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.351 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.352 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.352 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.352 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.352 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.352 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.352 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.352 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ironic.insecure = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.353 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.353 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.353 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.353 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.353 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.353 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.353 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.354 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.354 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.354 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.354 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.354 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.354 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.354 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.355 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.355 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.355 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] key_manager.fixed_key = **** log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.355 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] barbican.auth_endpoint = http://172.17.0.2:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.355 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.355 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] barbican.barbican_endpoint = http://172.17.0.2:9311 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.356 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] barbican.barbican_endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.356 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.356 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.356 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.356 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.356 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.357 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.357 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.357 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.357 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] barbican.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.357 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] barbican.verify_ssl = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.357 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.358 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.358 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vault.approle_secret_id = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.358 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vault.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.358 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vault.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.358 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.358 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vault.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.358 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vault.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.359 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.359 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.359 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.359 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.359 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.359 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.360 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.360 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.360 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.360 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.360 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.361 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 
14:34:42.361 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.361 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.361 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.361 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.362 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] keystone.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.362 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] keystone.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.362 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.362 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.362 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.362 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.362 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.363 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.363 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] keystone.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.363 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.363 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] keystone.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.363 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.connection_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.363 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 
14:34:42.363 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.364 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.364 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.364 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.364 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.disk_cachemodes = ['network=writeback'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.364 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.disk_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.364 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.365 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.365 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.gid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.365 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.hw_disk_discard = unmap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.365 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.hw_machine_type = ['x86_64=pc-q35-rhel9.0.0'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.365 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.images_rbd_ceph_conf = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.365 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.365 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.366 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.366 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.images_rbd_pool = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.366 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] 
libvirt.images_type = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.366 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.366 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.366 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.inject_partition = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.367 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.367 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.367 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.iser_use_multipath = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.367 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.367 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.368 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.368 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.368 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.369 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.live_migration_inbound_addr = standalone.internalapi.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.369 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.369 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.369 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.369 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.live_migration_timeout_action = abort log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.369 2 WARNING oslo_config.cfg [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] Deprecated: Option "live_migration_tunnelled" from group "libvirt" is deprecated for removal ( The "tunnelled live migration" has two inherent limitations: it cannot handle live migration of disks in a non-shared storage setup; and it has a huge performance cost. Both these problems are solved by ``live_migration_with_native_tls`` (requires a pre-configured TLS environment), which is the recommended approach for securing all live migration streams.). Its value may be silently ignored in the future. 2025-10-09 14:34:42.370 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.370 2 WARNING oslo_config.cfg [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( live_migration_uri is deprecated for removal in favor of two other options that allow to change live migration scheme and target URI: ``live_migration_scheme`` and ``live_migration_inbound_addr`` respectively. ). Its value may be silently ignored in the future. 2025-10-09 14:34:42.370 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.live_migration_uri = qemu+ssh://nova_migration@%s:2022/system?keyfile=/etc/nova/migration/identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.370 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.370 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.371 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.371 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.371 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.371 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.371 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.372 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.372 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.372 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.num_pcie_ports = 16 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.372 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.372 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.372 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.quobyte_client_cfg = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.373 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.373 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.373 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.373 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.373 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.rbd_secret_uuid = 2d56dca7-2cfb-5a40-a06c-893cfa606b8c log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.373 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.rbd_user = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.373 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.374 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.374 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.374 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.374 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.374 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.374 2 DEBUG 
oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.rx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.374 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.smbfs_mount_options = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.375 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.375 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.375 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.375 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.375 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.375 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.swtpm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.376 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.swtpm_group = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.376 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.376 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.376 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.376 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.376 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.377 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.377 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.volume_clear = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.377 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.volume_clear_size = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.377 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.volume_use_multipath = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.377 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.377 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.378 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.378 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.378 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.vzstorage_mount_perms = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.378 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.378 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.379 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.379 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.379 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] neutron.auth_type = v3password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.379 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.379 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] neutron.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.379 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.379 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.380 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.380 2 DEBUG oslo_service.service 
[req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] neutron.default_floating_pool = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.380 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.380 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.380 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] neutron.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.380 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.380 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.381 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.381 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.381 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.381 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.381 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.381 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.381 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] neutron.service_metadata_proxy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.382 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.382 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.382 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.382 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.382 2 DEBUG oslo_service.service 
[req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.382 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] neutron.timeout = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.383 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] neutron.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.383 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.383 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.383 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.383 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.384 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.384 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.384 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] pci.alias = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.384 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] pci.passthrough_whitelist = ['{"devname":"dummy-dev","physical_network":"dummy_sriov_net"}'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.384 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.385 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.385 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.auth_url = http://172.17.0.2:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.385 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.385 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.385 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] 
placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.385 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.385 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.385 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.386 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.386 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.386 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.386 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.386 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.386 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.387 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.387 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.387 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.387 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.387 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.387 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.387 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.388 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - 
- - - -] placement.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.388 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.388 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.service_type = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.388 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.388 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.388 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.389 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.389 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.389 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.389 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.389 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.389 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.389 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.username = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.390 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.390 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] placement.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.390 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] powervm.disk_driver = localdisk log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.390 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] powervm.proc_units_factor = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.390 2 DEBUG oslo_service.service 
[req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] powervm.volume_group_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.390 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.390 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.391 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.391 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.391 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.391 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] quota.injected_files = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.391 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.392 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.392 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.392 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.392 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.392 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.392 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.392 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.393 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.393 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.393 2 DEBUG oslo_service.service 
[req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.393 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.393 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.393 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.394 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.394 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.394 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] scheduler.query_placement_for_availability_zone = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.394 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.394 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.394 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] scheduler.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.394 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.395 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.395 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.395 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.395 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.395 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.396 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.396 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] filter_scheduler.enabled_filters = ['AvailabilityZoneFilter', 'ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.396 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.396 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.396 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.396 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.397 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.397 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.397 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.397 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.397 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] 
filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.397 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.398 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.398 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.398 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.398 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.398 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.398 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.399 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.399 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.399 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.399 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.399 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.399 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.400 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.400 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.400 2 DEBUG oslo_service.service 
[req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.400 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.400 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.400 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.400 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.401 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.401 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.401 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.401 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.401 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.401 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.401 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.402 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.402 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.402 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.402 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.402 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] spice.html5proxy_port = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 
2025-10-09 14:34:42.402 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.403 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.403 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] upgrade_levels.baseapi = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.403 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.403 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] upgrade_levels.compute = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.403 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.403 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.404 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.404 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.404 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.404 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.404 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.404 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.405 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.405 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.405 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.405 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - 
-] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.405 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.405 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.405 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.405 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.406 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.406 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vmware.datastore_regex = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.406 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vmware.host_ip = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.406 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.406 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.406 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.407 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.407 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.407 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.407 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.407 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.407 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.408 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vmware.serial_log_dir = /opt/vmware/vspc 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.408 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.408 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.408 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.408 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.408 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.409 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.409 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.409 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.409 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.409 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vnc.novncproxy_base_url = http://172.21.0.2:6080/vnc_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.410 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.410 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.410 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vnc.server_listen = 172.17.0.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.410 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vnc.server_proxyclient_address = 172.17.0.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.410 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.410 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.411 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] 
vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.411 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.411 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.411 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.411 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.411 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.411 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] workarounds.disable_native_luksv1 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.412 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.412 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.412 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] workarounds.enable_qemu_monitor_announce_self = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.412 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.412 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.413 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.413 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.413 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] workarounds.rbd_volume_local_attach = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.413 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] workarounds.reserve_disk_resource_for_image_cache = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.413 2 DEBUG 
oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.413 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] workarounds.skip_cpu_compare_on_dest = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.414 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.414 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.414 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.414 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.414 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.414 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.415 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.415 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.415 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.415 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.415 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.415 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.415 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.416 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.416 2 DEBUG oslo_service.service 
[req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.416 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.417 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.417 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.417 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.417 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.417 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.417 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.418 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.418 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.418 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.418 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.418 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.418 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.419 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.419 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] database.max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.419 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.419 2 DEBUG 
oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.419 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.419 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.420 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.420 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.420 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] database.use_db_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.420 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] database.use_tpool = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.420 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.420 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.420 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.421 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_policy.enforce_scope = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.421 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.421 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.421 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.421 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.421 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.422 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] 
oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.422 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.422 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.422 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.422 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.422 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.422 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.423 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.423 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.423 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.423 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.423 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.423 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.423 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.424 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.424 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.424 2 DEBUG oslo_service.service 
[req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.424 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.424 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.424 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.424 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.425 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.425 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.425 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.425 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.425 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.425 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.425 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.426 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.426 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.426 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.426 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 
2025-10-09 14:34:42.426 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.426 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.427 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.427 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.427 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.427 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.427 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.427 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.428 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.428 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.428 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.428 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.428 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.428 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vif_plug_ovs_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.429 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.429 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.429 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.429 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.429 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] vif_plug_ovs_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.429 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.429 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.430 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.430 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.430 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.430 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.430 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.430 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.430 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] os_vif_ovs.default_qos_type = linux-noop log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.430 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.431 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.431 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.431 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.431 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.431 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.431 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.431 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.432 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.432 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] privsep_osbrick.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.432 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.432 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] privsep_osbrick.logger_name = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.432 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.432 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.432 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.433 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.433 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.433 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.433 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.433 2 DEBUG oslo_service.service [req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-09 14:34:42.433 2 DEBUG oslo_service.service 
[req-6fe01aeb-93bf-486b-87f1-0bd9a57c39e2 - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2617 2025-10-09 14:34:42.435 2 INFO nova.service [-] Starting compute node (version 23.2.3-17.1.20250522071028.2ace99d.el9ost) 2025-10-09 14:34:42.450 2 DEBUG nova.virt.libvirt.host [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:491 2025-10-09 14:34:42.451 2 DEBUG nova.virt.libvirt.host [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:497 2025-10-09 14:34:42.451 2 DEBUG nova.virt.libvirt.host [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:619 2025-10-09 14:34:42.451 2 DEBUG nova.virt.libvirt.host [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:502 2025-10-09 14:34:42.458 2 DEBUG nova.virt.libvirt.host [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Registering for lifecycle events _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:508 2025-10-09 14:34:42.462 2 DEBUG nova.virt.libvirt.host [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:529 2025-10-09 14:34:42.463 2 INFO nova.virt.libvirt.driver [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Connection event '1' reason 'None' 2025-10-09 14:34:42.494 2 WARNING nova.virt.libvirt.driver [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Cannot update service status on host "standalone.localdomain" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host standalone.localdomain could not be found. 
2025-10-09 14:34:42.494 2 DEBUG nova.virt.libvirt.volume.mount [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
2025-10-09 14:34:43.263 2 INFO nova.virt.libvirt.host [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Libvirt host capabilities [capabilities XML condensed: host uuid da7dd545-e3fc-420a-b1b5-9235ad14ded1, arch x86_64, CPU model EPYC-Rome (vendor AMD), migration transports tcp and rdma, security models selinux and dac, hvm guests for i686 and x86_64 via /usr/libexec/qemu-kvm with machine types pc-i440fx-rhel7.6.0 (pc) and pc-q35-rhel7.6.0 through pc-q35-rhel9.2.0 (q35)]
2025-10-09 14:34:43.268 2 DEBUG nova.virt.libvirt.host [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:939
2025-10-09 14:34:43.293 2 DEBUG nova.virt.libvirt.host [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc: [domainCapabilities XML condensed: emulator /usr/libexec/qemu-kvm, domain kvm, machine pc-i440fx-rhel7.6.0, loader /usr/share/OVMF/OVMF_CODE.secboot.fd, host CPU EPYC-Rome (AMD); supported guest CPU model list and device/feature details elided] _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1024
2025-10-09 14:34:43.295 2 DEBUG nova.virt.libvirt.host [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35: [domainCapabilities XML condensed: emulator /usr/libexec/qemu-kvm, domain kvm, machine pc-q35-rhel9.2.0, loader /usr/share/OVMF/OVMF_CODE.secboot.fd, host CPU EPYC-Rome (AMD); supported guest CPU model list and device/feature details elided] _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1024
2025-10-09 14:34:43.296 2 DEBUG nova.virt.libvirt.host [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35', 'pc-q35-rhel9.0.0'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:939
2025-10-09 14:34:43.298 2 DEBUG nova.virt.libvirt.host [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc: [domainCapabilities XML condensed: emulator /usr/libexec/qemu-kvm, domain kvm, machine pc-i440fx-rhel7.6.0, loader /usr/share/OVMF/OVMF_CODE.secboot.fd, host CPU EPYC-Rome (AMD); supported guest CPU model list and device/feature details elided] _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1024
2025-10-09 14:34:43.301 2 DEBUG nova.virt.libvirt.host [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35: [domainCapabilities XML condensed: emulator /usr/libexec/qemu-kvm, domain kvm, machine pc-q35-rhel9.2.0, efi firmware /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd, /usr/share/edk2/ovmf/OVMF_CODE.fd, /usr/share/edk2/ovmf/OVMF.amdsev.fd, /usr/share/edk2/ovmf/OVMF.inteltdx.fd, host CPU EPYC-Rome (AMD); supported guest CPU model list and device/feature details elided] _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1024
2025-10-09 14:34:43.303 2 DEBUG nova.virt.libvirt.host [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc-q35-rhel9.0.0: [domainCapabilities XML condensed: emulator /usr/libexec/qemu-kvm, domain kvm, machine pc-q35-rhel9.0.0, efi firmware /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd, /usr/share/edk2/ovmf/OVMF_CODE.fd, /usr/share/edk2/ovmf/OVMF.amdsev.fd, /usr/share/edk2/ovmf/OVMF.inteltdx.fd, host CPU EPYC-Rome (AMD); supported guest CPU model list and device/feature details elided] _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1024
2025-10-09 14:34:43.303 2 DEBUG nova.virt.libvirt.host [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1592
2025-10-09 14:34:43.303 2 DEBUG nova.virt.libvirt.host [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1592
2025-10-09 14:34:43.304 2 DEBUG nova.virt.libvirt.host [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1592
2025-10-09 14:34:43.304 2 INFO nova.virt.libvirt.host [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Secure Boot support detected
2025-10-09 14:34:43.304 2 INFO nova.virt.libvirt.driver [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
2025-10-09 14:34:43.304 2 INFO nova.virt.libvirt.driver [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use. 2025-10-09 14:34:43.342 2 INFO nova.virt.node [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Generated node identity 019ad6c7-b66d-4a42-8fad-340daa8e4d4a 2025-10-09 14:34:43.342 2 INFO nova.virt.node [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Wrote node identity 019ad6c7-b66d-4a42-8fad-340daa8e4d4a to /var/lib/nova/compute_id 2025-10-09 14:34:43.358 2 WARNING nova.compute.manager [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Compute nodes ['019ad6c7-b66d-4a42-8fad-340daa8e4d4a'] for host standalone.localdomain were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. 2025-10-09 14:34:43.388 2 INFO nova.compute.manager [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host 2025-10-09 14:34:43.411 2 WARNING nova.compute.manager [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] No compute node record found for host standalone.localdomain. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host standalone.localdomain could not be found. 2025-10-09 14:34:43.412 2 DEBUG oslo_concurrency.lockutils [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:34:43.412 2 DEBUG oslo_concurrency.lockutils [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:34:43.412 2 DEBUG nova.compute.resource_tracker [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858 2025-10-09 14:34:43.412 2 DEBUG oslo_concurrency.processutils [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:34:43.859 2 DEBUG oslo_concurrency.processutils [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:34:44.074 2 WARNING nova.virt.libvirt.driver [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
2025-10-09 14:34:44.076 2 DEBUG nova.compute.resource_tracker [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=4703MB free_disk=6.99609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033 2025-10-09 14:34:44.076 2 DEBUG oslo_concurrency.lockutils [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:34:44.088 2 WARNING nova.compute.resource_tracker [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] No compute node record for standalone.localdomain:019ad6c7-b66d-4a42-8fad-340daa8e4d4a: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 019ad6c7-b66d-4a42-8fad-340daa8e4d4a could not be found. 
2025-10-09 14:34:44.107 2 INFO nova.compute.resource_tracker [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Compute node record created for standalone.localdomain:standalone.localdomain with uuid: 019ad6c7-b66d-4a42-8fad-340daa8e4d4a 2025-10-09 14:34:44.151 2 DEBUG nova.compute.resource_tracker [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056 2025-10-09 14:34:44.152 2 DEBUG nova.compute.resource_tracker [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=512MB phys_disk=6GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065 2025-10-09 14:34:44.572 2 INFO nova.scheduler.client.report [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] [req-289e2f3e-5bee-461a-9d30-633ca19eff9e] Created resource provider record via placement API for resource provider with UUID 019ad6c7-b66d-4a42-8fad-340daa8e4d4a and name standalone.localdomain. 2025-10-09 14:34:44.573 2 DEBUG oslo_concurrency.processutils [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:34:45.035 2 DEBUG oslo_concurrency.processutils [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:34:45.041 2 DEBUG nova.virt.libvirt.host [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] /sys/module/kvm_amd/parameters/sev contains [N ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1613 2025-10-09 14:34:45.042 2 INFO nova.virt.libvirt.host [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] kernel doesn't support AMD SEV 2025-10-09 14:34:45.043 2 DEBUG nova.compute.provider_tree [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Updating inventory in ProviderTree for provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 6, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:175 2025-10-09 14:34:45.044 2 DEBUG nova.virt.libvirt.driver [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5177 2025-10-09 14:34:45.102 2 DEBUG nova.scheduler.client.report [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Updated inventory for provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 6, 
'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957 2025-10-09 14:34:45.102 2 DEBUG nova.compute.provider_tree [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Updating resource provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:163 2025-10-09 14:34:45.102 2 DEBUG nova.compute.provider_tree [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Updating inventory in ProviderTree for provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a with inventory: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:175 2025-10-09 14:34:45.204 2 DEBUG nova.compute.provider_tree [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Updating resource provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:163 2025-10-09 14:34:45.204 2 DEBUG nova.compute.resource_tracker [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994 2025-10-09 14:34:45.204 2 DEBUG oslo_concurrency.lockutils [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:34:45.205 2 DEBUG nova.service [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182 2025-10-09 14:34:45.235 2 DEBUG nova.service [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199 2025-10-09 14:34:45.236 2 DEBUG nova.servicegroup.drivers.db [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] DB_Driver: join new ServiceGroup member standalone.localdomain to the compute group, service = join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44 2025-10-09 14:35:24.238 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:35:24.255 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:35:41.660 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 
14:35:41.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:35:41.723 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-09 14:35:41.723 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518 2025-10-09 14:35:41.741 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9600 2025-10-09 14:35:41.741 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:35:41.742 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:35:41.743 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:35:41.743 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:35:41.803 2 INFO nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running instance usage audit for host standalone.localdomain from 2025-10-09 13:00:00 to 2025-10-09 14:00:00. 0 instances. 2025-10-09 14:35:42.016 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:35:42.017 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:35:42.018 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-09 14:35:42.018 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:35:42.039 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:35:42.039 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:35:42.039 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858 2025-10-09 14:35:42.040 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:35:42.563 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.522s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:35:42.790 2 WARNING nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
2025-10-09 14:35:42.791 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=4399MB free_disk=6.99609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033 2025-10-09 14:35:42.791 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:35:42.844 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056 2025-10-09 14:35:42.844 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=512MB phys_disk=6GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065 2025-10-09 14:35:42.871 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:35:43.377 2 DEBUG oslo_concurrency.processutils 
[req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:35:43.468 2 DEBUG nova.compute.provider_tree [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Inventory has not changed in ProviderTree for provider: 019ad6c7-b66d-4a42-8fad-340daa8e4d4a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179 2025-10-09 14:35:43.480 2 DEBUG nova.scheduler.client.report [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Inventory has not changed for provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 2025-10-09 14:35:43.481 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994 2025-10-09 14:35:43.481 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.690s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:36:43.403 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:36:43.413 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:36:43.488 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:36:43.488 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-09 14:36:43.488 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518 2025-10-09 14:36:43.506 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Didn't find any instances for network info cache update. 
_heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9600 2025-10-09 14:36:43.506 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:36:43.507 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:36:43.507 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:36:43.507 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-09 14:36:43.507 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:36:43.554 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:36:43.554 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:36:43.554 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858 2025-10-09 14:36:43.555 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:36:44.376 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.821s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:36:44.694 2 WARNING nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
2025-10-09 14:36:44.695 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=4562MB free_disk=6.99609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033 2025-10-09 14:36:44.695 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:36:44.750 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056 2025-10-09 14:36:44.751 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=512MB phys_disk=6GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065 2025-10-09 14:36:44.752 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:36:45.281 2 DEBUG oslo_concurrency.processutils 
[req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.529s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:36:45.286 2 DEBUG nova.compute.provider_tree [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Inventory has not changed in ProviderTree for provider: 019ad6c7-b66d-4a42-8fad-340daa8e4d4a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179 2025-10-09 14:36:45.310 2 DEBUG nova.scheduler.client.report [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Inventory has not changed for provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 2025-10-09 14:36:45.310 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994 2025-10-09 14:36:45.311 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:36:45.531 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:36:45.532 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:36:45.546 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:37:42.725 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:37:42.727 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-09 14:37:42.727 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518 2025-10-09 14:37:42.744 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Didn't find any instances for network info cache update. 
_heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9600 2025-10-09 14:37:42.745 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:37:42.767 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:37:42.767 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:37:42.767 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858 2025-10-09 14:37:42.768 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:37:43.249 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:37:43.447 2 WARNING nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
2025-10-09 14:37:43.449 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=4630MB free_disk=6.99609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033 2025-10-09 14:37:43.449 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:37:43.517 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056 2025-10-09 14:37:43.517 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=512MB phys_disk=6GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065 2025-10-09 14:37:43.552 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:37:44.063 2 DEBUG oslo_concurrency.processutils 
[req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:37:44.070 2 DEBUG nova.compute.provider_tree [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Inventory has not changed in ProviderTree for provider: 019ad6c7-b66d-4a42-8fad-340daa8e4d4a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179 2025-10-09 14:37:44.086 2 DEBUG nova.scheduler.client.report [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Inventory has not changed for provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 2025-10-09 14:37:44.087 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994 2025-10-09 14:37:44.087 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:37:45.064 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:37:45.065 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:37:45.065 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:37:45.065 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:37:45.065 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:37:45.066 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:37:45.066 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-09 14:37:45.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:38:41.671 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:38:43.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:38:44.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:38:44.723 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-09 14:38:44.723 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518 2025-10-09 14:38:44.744 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9600 2025-10-09 14:38:44.745 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:38:44.745 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:38:44.745 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:38:44.745 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-09 14:38:44.745 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:38:44.765 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:38:44.765 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:38:44.765 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858 2025-10-09 14:38:44.766 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:38:45.215 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:38:45.415 2 WARNING nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
2025-10-09 14:38:45.417 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=4496MB free_disk=6.99609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033 2025-10-09 14:38:45.417 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:38:45.517 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056 2025-10-09 14:38:45.518 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=512MB phys_disk=6GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065 2025-10-09 14:38:45.520 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:38:45.953 2 DEBUG oslo_concurrency.processutils 
[req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:38:45.980 2 DEBUG nova.compute.provider_tree [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Inventory has not changed in ProviderTree for provider: 019ad6c7-b66d-4a42-8fad-340daa8e4d4a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179 2025-10-09 14:38:46.003 2 DEBUG nova.scheduler.client.report [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Inventory has not changed for provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 2025-10-09 14:38:46.004 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994 2025-10-09 14:38:46.005 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.588s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:38:46.982 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:38:46.983 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:38:46.995 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:39:41.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:39:41.722 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10780 2025-10-09 14:39:41.735 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10789 2025-10-09 14:39:41.738 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:39:41.738 2 DEBUG 
nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:10818 2025-10-09 14:39:41.745 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:39:44.750 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:39:44.750 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:39:44.768 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:39:44.768 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:39:44.768 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858 2025-10-09 14:39:44.769 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:39:45.252 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:39:45.434 2 WARNING nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
2025-10-09 14:39:45.435 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=4547MB free_disk=6.99609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033 2025-10-09 14:39:45.435 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:39:45.532 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056 2025-10-09 14:39:45.532 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=512MB phys_disk=6GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065 2025-10-09 14:39:45.551 2 DEBUG nova.scheduler.client.report [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Refreshing inventories for resource provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804 2025-10-09 14:39:45.568 2 DEBUG nova.scheduler.client.report 
[req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Updating ProviderTree inventory for provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768 2025-10-09 14:39:45.569 2 DEBUG nova.compute.provider_tree [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Updating inventory in ProviderTree for provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:175 2025-10-09 14:39:45.588 2 DEBUG nova.scheduler.client.report [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Refreshing aggregate associations for resource provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813 2025-10-09 14:39:45.609 2 DEBUG nova.scheduler.client.report [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Refreshing trait associations for resource provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a, traits: COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_BMI2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_SATA,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AESNI,HW_CPU_X86_SSE4A,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_NODE,HW_CPU_X86_BMI,HW_CPU_X86_ABM,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_AMD_SVM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE42,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_F16C _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825 2025-10-09 14:39:45.610 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:39:46.046 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df 
--format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:39:46.058 2 DEBUG nova.compute.provider_tree [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Inventory has not changed in ProviderTree for provider: 019ad6c7-b66d-4a42-8fad-340daa8e4d4a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179 2025-10-09 14:39:46.108 2 DEBUG nova.scheduler.client.report [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Inventory has not changed for provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 2025-10-09 14:39:46.109 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994 2025-10-09 14:39:46.109 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.674s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:39:47.080 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:39:47.081 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:39:47.081 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-09 14:39:47.082 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518 2025-10-09 14:39:47.096 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Didn't find any instances for network info cache update. 
_heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9600 2025-10-09 14:39:47.097 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:39:47.098 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:39:47.098 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:39:47.108 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:39:47.109 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-09 14:39:47.723 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:40:44.643 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:40:44.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:40:45.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:40:45.723 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:40:45.723 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-09 14:40:46.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:40:46.723 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-09 14:40:46.723 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518 2025-10-09 14:40:46.739 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9600 2025-10-09 14:40:46.739 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:40:46.740 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:40:46.769 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:40:46.770 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:40:46.770 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858 2025-10-09 14:40:46.771 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:40:47.289 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.518s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:40:47.454 2 WARNING nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
2025-10-09 14:40:47.456 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=4591MB free_disk=6.99609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033 2025-10-09 14:40:47.456 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:40:47.508 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056 2025-10-09 14:40:47.508 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=512MB phys_disk=6GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065 2025-10-09 14:40:47.510 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:40:47.951 2 DEBUG oslo_concurrency.processutils 
[req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:40:47.958 2 DEBUG nova.compute.provider_tree [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Inventory has not changed in ProviderTree for provider: 019ad6c7-b66d-4a42-8fad-340daa8e4d4a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179 2025-10-09 14:40:47.980 2 DEBUG nova.scheduler.client.report [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Inventory has not changed for provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 2025-10-09 14:40:47.980 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994 2025-10-09 14:40:47.981 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.525s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:40:48.901 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:40:48.902 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:40:48.913 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:41:45.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:41:45.723 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:41:45.723 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-09 14:41:46.723 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:41:46.723 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:41:46.741 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:41:46.742 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:41:46.742 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858 2025-10-09 14:41:46.742 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:41:47.192 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:41:47.435 2 WARNING nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
2025-10-09 14:41:47.437 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=4682MB free_disk=6.99609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033 2025-10-09 14:41:47.438 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:41:47.508 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056 2025-10-09 14:41:47.508 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=512MB phys_disk=6GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065 2025-10-09 14:41:47.512 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:41:47.935 2 DEBUG oslo_concurrency.processutils 
[req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:41:47.942 2 DEBUG nova.compute.provider_tree [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Inventory has not changed in ProviderTree for provider: 019ad6c7-b66d-4a42-8fad-340daa8e4d4a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179 2025-10-09 14:41:47.978 2 DEBUG nova.scheduler.client.report [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Inventory has not changed for provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 2025-10-09 14:41:47.979 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994 2025-10-09 14:41:47.980 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.542s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:41:48.978 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:41:48.979 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-09 14:41:48.979 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518 2025-10-09 14:41:48.997 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Didn't find any instances for network info cache update. 
_heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9600 2025-10-09 14:41:48.998 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:41:48.998 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:41:49.674 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:41:49.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:42:45.721 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:42:45.722 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-09 14:42:46.724 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:42:47.721 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:42:48.642 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:42:48.721 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:42:48.722 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-09 14:42:48.722 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518 2025-10-09 14:42:48.737 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Didn't find any instances for network info cache update. 
_heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9600 2025-10-09 14:42:48.738 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:42:48.755 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:42:48.755 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:42:48.755 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858 2025-10-09 14:42:48.756 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:42:49.294 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.538s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:42:49.490 2 WARNING nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
2025-10-09 14:42:49.491 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=4492MB free_disk=6.99609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033 2025-10-09 14:42:49.491 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:42:49.548 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056 2025-10-09 14:42:49.548 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=512MB phys_disk=6GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065 2025-10-09 14:42:49.550 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:42:50.059 2 DEBUG oslo_concurrency.processutils 
[req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:42:50.066 2 DEBUG nova.compute.provider_tree [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Inventory has not changed in ProviderTree for provider: 019ad6c7-b66d-4a42-8fad-340daa8e4d4a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179 2025-10-09 14:42:50.081 2 DEBUG nova.scheduler.client.report [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Inventory has not changed for provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 2025-10-09 14:42:50.082 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994 2025-10-09 14:42:50.082 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.591s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:42:51.067 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:42:51.068 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:42:51.085 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:42:51.661 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:43:16.232 2 DEBUG oslo_concurrency.lockutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lock "f9af0e26-e2a2-439e-9de7-367991eb09d8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:43:16.249 2 DEBUG nova.compute.manager [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Starting instance... 
_do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2306 2025-10-09 14:43:16.379 2 DEBUG oslo_concurrency.lockutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:43:16.386 2 DEBUG nova.virt.hardware [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2200 2025-10-09 14:43:16.387 2 INFO nova.compute.claims [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Claim successful on node standalone.localdomain 2025-10-09 14:43:16.505 2 DEBUG oslo_concurrency.processutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:43:16.914 2 DEBUG oslo_concurrency.processutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:43:16.923 2 DEBUG nova.compute.provider_tree [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Inventory has not changed in ProviderTree for provider: 019ad6c7-b66d-4a42-8fad-340daa8e4d4a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179 2025-10-09 14:43:16.942 2 DEBUG nova.scheduler.client.report [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Inventory has not changed for provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 2025-10-09 14:43:16.943 2 DEBUG oslo_concurrency.lockutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.564s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:43:16.944 2 DEBUG nova.compute.manager [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Start building networks asynchronously for instance. 
_build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2679 2025-10-09 14:43:17.035 2 DEBUG nova.compute.manager [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1845 2025-10-09 14:43:17.035 2 DEBUG nova.network.neutron [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1020 2025-10-09 14:43:17.053 2 DEBUG nova.block_device [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] block_device_list [] volume_in_mapping /usr/lib/python3.9/site-packages/nova/block_device.py:617 2025-10-09 14:43:17.068 2 DEBUG nova.compute.manager [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2714 2025-10-09 14:43:17.187 2 DEBUG nova.compute.manager [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2510 2025-10-09 14:43:17.188 2 DEBUG nova.block_device [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] block_device_list ['vdb'] volume_in_mapping /usr/lib/python3.9/site-packages/nova/block_device.py:617 2025-10-09 14:43:17.189 2 DEBUG nova.virt.libvirt.driver [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4592 2025-10-09 14:43:17.190 2 INFO nova.virt.libvirt.driver [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Creating image 2025-10-09 14:43:17.248 2 DEBUG nova.storage.rbd_utils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] rbd image f9af0e26-e2a2-439e-9de7-367991eb09d8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:79 2025-10-09 14:43:17.281 2 DEBUG nova.storage.rbd_utils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] rbd image f9af0e26-e2a2-439e-9de7-367991eb09d8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:79 2025-10-09 14:43:17.313 2 DEBUG nova.storage.rbd_utils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] rbd image f9af0e26-e2a2-439e-9de7-367991eb09d8_disk does not exist __init__ 
/usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:79 2025-10-09 14:43:17.318 2 DEBUG oslo_concurrency.lockutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lock "a427375a4beb6fb8f61d3d5d340c3b9923e14f3b" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:43:17.457 2 DEBUG nova.virt.libvirt.imagebackend [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Image locations are: [{'url': 'rbd://2d56dca7-2cfb-5a40-a06c-893cfa606b8c/images/ff8a5cba-0182-4533-98fc-6ff2255d8ad5/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://2d56dca7-2cfb-5a40-a06c-893cfa606b8c/images/ff8a5cba-0182-4533-98fc-6ff2255d8ad5/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1046 2025-10-09 14:43:17.522 2 WARNING oslo_policy.policy [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html. 2025-10-09 14:43:17.522 2 WARNING oslo_policy.policy [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html. 
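Note: the two Glance locations logged above are rbd:// URLs, but the image itself is qcow2 (the fetch_to_raw entry below reports "ff8a5cba-0182-4533-98fc-6ff2255d8ad5 was qcow2, converting to raw"), so the libvirt Rbd image backend does not COW-clone the Glance snapshot; it downloads the image into /var/lib/nova/instances/_base, converts it to raw with qemu-img, and imports the result into the vms pool with rbd import, which is exactly the command sequence in the entries that follow. The repeated oslo.policy warning can be silenced by converting the JSON policy file to YAML with the oslopolicy-convert-json-to-yaml tool named in the message. Below is a minimal sketch of the clone decision only, under a hypothetical helper name; it is not Nova's actual imagebackend code, just the shape of the check it applies.

# Sketch only: simplified form of the "can we clone from Ceph?" decision.
# can_clone_in_ceph() is a made-up name for illustration.
def can_clone_in_ceph(image_meta, locations):
    if image_meta.get("disk_format") != "raw":
        # qcow2 image: fall back to fetch, qemu-img convert to raw, then rbd import
        return False
    # raw image: clonable only if at least one location lives in an rbd:// store
    return any(loc["url"].startswith("rbd://") for loc in locations)

image_meta = {"disk_format": "qcow2"}  # matches the ImageMeta(disk_format='qcow2', ...) logged later
locations = [{"url": "rbd://2d56dca7-2cfb-5a40-a06c-893cfa606b8c/images/ff8a5cba-0182-4533-98fc-6ff2255d8ad5/snap"}]
print(can_clone_in_ceph(image_meta, locations))  # False -> download + convert + rbd import path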
2025-10-09 14:43:17.901 2 DEBUG oslo_concurrency.processutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a427375a4beb6fb8f61d3d5d340c3b9923e14f3b.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:43:18.010 2 DEBUG oslo_concurrency.processutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a427375a4beb6fb8f61d3d5d340c3b9923e14f3b.part --force-share --output=json" returned: 0 in 0.109s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:43:18.012 2 DEBUG nova.virt.images [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] ff8a5cba-0182-4533-98fc-6ff2255d8ad5 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242 2025-10-09 14:43:18.014 2 DEBUG nova.privsep.utils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63 2025-10-09 14:43:18.015 2 DEBUG oslo_concurrency.processutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/a427375a4beb6fb8f61d3d5d340c3b9923e14f3b.part /var/lib/nova/instances/_base/a427375a4beb6fb8f61d3d5d340c3b9923e14f3b.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:43:18.107 2 DEBUG nova.network.neutron [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Successfully created port: fb9e0330-dc2b-4a7f-ac53-9170e023695a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:549 2025-10-09 14:43:18.267 2 DEBUG oslo_concurrency.processutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/a427375a4beb6fb8f61d3d5d340c3b9923e14f3b.part /var/lib/nova/instances/_base/a427375a4beb6fb8f61d3d5d340c3b9923e14f3b.converted" returned: 0 in 0.252s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:43:18.271 2 DEBUG oslo_concurrency.processutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a427375a4beb6fb8f61d3d5d340c3b9923e14f3b.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:43:18.338 2 DEBUG 
oslo_concurrency.processutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a427375a4beb6fb8f61d3d5d340c3b9923e14f3b.converted --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:43:18.340 2 DEBUG oslo_concurrency.lockutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lock "a427375a4beb6fb8f61d3d5d340c3b9923e14f3b" released by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 1.021s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:43:18.374 2 DEBUG nova.storage.rbd_utils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] rbd image f9af0e26-e2a2-439e-9de7-367991eb09d8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:79 2025-10-09 14:43:18.379 2 DEBUG oslo_concurrency.processutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a427375a4beb6fb8f61d3d5d340c3b9923e14f3b f9af0e26-e2a2-439e-9de7-367991eb09d8_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:43:19.355 2 DEBUG nova.network.neutron [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Successfully updated port: fb9e0330-dc2b-4a7f-ac53-9170e023695a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:587 2025-10-09 14:43:19.374 2 DEBUG oslo_concurrency.lockutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Acquired lock "refresh_cache-f9af0e26-e2a2-439e-9de7-367991eb09d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:266 2025-10-09 14:43:19.375 2 DEBUG nova.network.neutron [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:1872 2025-10-09 14:43:19.436 2 DEBUG nova.network.neutron [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Instance cache missing network info. 
_get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3049 2025-10-09 14:43:19.669 2 DEBUG nova.network.neutron [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Updating instance_info_cache with network_info: [{"id": "fb9e0330-dc2b-4a7f-ac53-9170e023695a", "address": "fa:16:3e:10:ac:f1", "network": {"id": "32a6ee04-a550-43a4-bfdf-03b8e6b3dac4", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.98", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c66b981ab228418cac8c3cbc73ada639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapfb9e0330-dc", "ovs_interfaceid": "fb9e0330-dc2b-4a7f-ac53-9170e023695a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116 2025-10-09 14:43:19.694 2 DEBUG oslo_concurrency.lockutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Releasing lock "refresh_cache-f9af0e26-e2a2-439e-9de7-367991eb09d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:282 2025-10-09 14:43:19.695 2 DEBUG nova.compute.manager [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Instance network_info: |[{"id": "fb9e0330-dc2b-4a7f-ac53-9170e023695a", "address": "fa:16:3e:10:ac:f1", "network": {"id": "32a6ee04-a550-43a4-bfdf-03b8e6b3dac4", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.98", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c66b981ab228418cac8c3cbc73ada639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapfb9e0330-dc", "ovs_interfaceid": "fb9e0330-dc2b-4a7f-ac53-9170e023695a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1859 2025-10-09 14:43:19.741 2 DEBUG nova.compute.manager [req-2ba95ab2-c7fe-49c9-a5de-03a682c3e8ad e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Received event network-changed-fb9e0330-dc2b-4a7f-ac53-9170e023695a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10683 2025-10-09 14:43:19.741 2 DEBUG nova.compute.manager [req-2ba95ab2-c7fe-49c9-a5de-03a682c3e8ad 
e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Refreshing instance network info cache due to event network-changed-fb9e0330-dc2b-4a7f-ac53-9170e023695a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10688 2025-10-09 14:43:19.741 2 DEBUG oslo_concurrency.lockutils [req-2ba95ab2-c7fe-49c9-a5de-03a682c3e8ad e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] Acquired lock "refresh_cache-f9af0e26-e2a2-439e-9de7-367991eb09d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:266 2025-10-09 14:43:19.742 2 DEBUG nova.network.neutron [req-2ba95ab2-c7fe-49c9-a5de-03a682c3e8ad e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Refreshing network info cache for port fb9e0330-dc2b-4a7f-ac53-9170e023695a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:1869 2025-10-09 14:43:20.139 2 DEBUG nova.network.neutron [req-2ba95ab2-c7fe-49c9-a5de-03a682c3e8ad e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Updated VIF entry in instance network info cache for port fb9e0330-dc2b-4a7f-ac53-9170e023695a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3183 2025-10-09 14:43:20.140 2 DEBUG nova.network.neutron [req-2ba95ab2-c7fe-49c9-a5de-03a682c3e8ad e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Updating instance_info_cache with network_info: [{"id": "fb9e0330-dc2b-4a7f-ac53-9170e023695a", "address": "fa:16:3e:10:ac:f1", "network": {"id": "32a6ee04-a550-43a4-bfdf-03b8e6b3dac4", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.98", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c66b981ab228418cac8c3cbc73ada639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapfb9e0330-dc", "ovs_interfaceid": "fb9e0330-dc2b-4a7f-ac53-9170e023695a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116 2025-10-09 14:43:20.156 2 DEBUG oslo_concurrency.lockutils [req-2ba95ab2-c7fe-49c9-a5de-03a682c3e8ad e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] Releasing lock "refresh_cache-f9af0e26-e2a2-439e-9de7-367991eb09d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:282 2025-10-09 14:43:20.799 2 DEBUG oslo_concurrency.processutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a427375a4beb6fb8f61d3d5d340c3b9923e14f3b f9af0e26-e2a2-439e-9de7-367991eb09d8_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.420s execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:43:20.890 2 DEBUG nova.storage.rbd_utils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] resizing rbd image f9af0e26-e2a2-439e-9de7-367991eb09d8_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:286 2025-10-09 14:43:21.017 2 DEBUG nova.objects.instance [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lazy-loading 'migration_context' on Instance uuid f9af0e26-e2a2-439e-9de7-367991eb09d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1098 2025-10-09 14:43:21.065 2 DEBUG nova.storage.rbd_utils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] rbd image f9af0e26-e2a2-439e-9de7-367991eb09d8_disk.eph0 does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:79 2025-10-09 14:43:21.090 2 DEBUG nova.storage.rbd_utils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] rbd image f9af0e26-e2a2-439e-9de7-367991eb09d8_disk.eph0 does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:79 2025-10-09 14:43:21.094 2 DEBUG oslo_concurrency.lockutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lock "ephemeral_1_0706d66" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:43:21.094 2 DEBUG oslo_concurrency.processutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/_base/ephemeral_1_0706d66 1G execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:43:21.117 2 DEBUG oslo_concurrency.processutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/_base/ephemeral_1_0706d66 1G" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:43:21.118 2 DEBUG oslo_concurrency.processutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Running cmd (subprocess): mkfs -t vfat -n ephemeral0 /var/lib/nova/instances/_base/ephemeral_1_0706d66 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:43:21.158 2 DEBUG oslo_concurrency.processutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] CMD "mkfs -t vfat -n ephemeral0 /var/lib/nova/instances/_base/ephemeral_1_0706d66" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:43:21.159 2 DEBUG oslo_concurrency.lockutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lock "ephemeral_1_0706d66" released by 
"nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.065s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:43:21.181 2 DEBUG nova.storage.rbd_utils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] rbd image f9af0e26-e2a2-439e-9de7-367991eb09d8_disk.eph0 does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:79 2025-10-09 14:43:21.185 2 DEBUG oslo_concurrency.processutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/ephemeral_1_0706d66 f9af0e26-e2a2-439e-9de7-367991eb09d8_disk.eph0 --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:43:21.962 2 DEBUG oslo_concurrency.processutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/ephemeral_1_0706d66 f9af0e26-e2a2-439e-9de7-367991eb09d8_disk.eph0 --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.776s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:43:22.088 2 DEBUG nova.virt.libvirt.driver [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719 2025-10-09 14:43:22.089 2 DEBUG nova.virt.libvirt.driver [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Ensure instance console log exists: /var/lib/nova/instances/f9af0e26-e2a2-439e-9de7-367991eb09d8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4482 2025-10-09 14:43:22.089 2 DEBUG oslo_concurrency.lockutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:43:22.090 2 DEBUG oslo_concurrency.lockutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lock "vgpu_resources" released by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:43:22.091 2 DEBUG nova.virt.libvirt.driver [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Start _get_guest_xml network_info=[{"id": "fb9e0330-dc2b-4a7f-ac53-9170e023695a", "address": "fa:16:3e:10:ac:f1", "network": {"id": "32a6ee04-a550-43a4-bfdf-03b8e6b3dac4", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.98", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c66b981ab228418cac8c3cbc73ada639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapfb9e0330-dc", "ovs_interfaceid": "fb9e0330-dc2b-4a7f-ac53-9170e023695a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'type': 'disk', 'dev': 'vda', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'type': 'disk', 'dev': 'vda', 'boot_index': '1'}, 'disk.eph0': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} image_meta=ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2025-10-09T14:42:12Z,direct_url=,disk_format='qcow2',id=ff8a5cba-0182-4533-98fc-6ff2255d8ad5,min_disk=0,min_ram=0,name='cirros',owner='c66b981ab228418cac8c3cbc73ada639',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2025-10-09T14:42:16Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'ephemerals': [{'device_name': '/dev/vdb', 'device_type': 'disk', 'disk_bus': 'virtio', 'size': 1, 'guest_format': None}], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7177 2025-10-09 14:43:22.096 2 WARNING nova.virt.libvirt.driver [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 2025-10-09 14:43:22.098 2 DEBUG nova.virt.libvirt.host [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Searching host: 'standalone.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1463 2025-10-09 14:43:22.099 2 DEBUG nova.virt.libvirt.host [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1473 2025-10-09 14:43:22.100 2 DEBUG nova.virt.libvirt.host [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Searching host: 'standalone.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1482 2025-10-09 14:43:22.101 2 DEBUG nova.virt.libvirt.host [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] CPU controller found on host. 
_has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1489 2025-10-09 14:43:22.101 2 DEBUG nova.virt.libvirt.driver [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5177 2025-10-09 14:43:22.102 2 DEBUG nova.virt.hardware [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-09T14:42:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=1,extra_specs={},flavorid='dfc79ac3-5f0f-42f3-94a9-2ab50a6911df',id=3,is_public=True,memory_mb=512,name='m1.small',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2025-10-09T14:42:12Z,direct_url=,disk_format='qcow2',id=ff8a5cba-0182-4533-98fc-6ff2255d8ad5,min_disk=0,min_ram=0,name='cirros',owner='c66b981ab228418cac8c3cbc73ada639',properties=ImageMetaProps,protected=,size=21692416,status='active',tags=,updated_at=2025-10-09T14:42:16Z,virtual_size=,visibility=), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:558 2025-10-09 14:43:22.102 2 DEBUG nova.virt.hardware [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:343 2025-10-09 14:43:22.102 2 DEBUG nova.virt.hardware [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:347 2025-10-09 14:43:22.102 2 DEBUG nova.virt.hardware [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:383 2025-10-09 14:43:22.103 2 DEBUG nova.virt.hardware [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:387 2025-10-09 14:43:22.103 2 DEBUG nova.virt.hardware [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:425 2025-10-09 14:43:22.103 2 DEBUG nova.virt.hardware [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:564 2025-10-09 14:43:22.103 2 DEBUG nova.virt.hardware [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] 
Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:466 2025-10-09 14:43:22.104 2 DEBUG nova.virt.hardware [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:496 2025-10-09 14:43:22.104 2 DEBUG nova.virt.hardware [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:570 2025-10-09 14:43:22.104 2 DEBUG nova.virt.hardware [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:572 2025-10-09 14:43:22.105 2 DEBUG nova.privsep.utils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63 2025-10-09 14:43:22.106 2 DEBUG oslo_concurrency.processutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:43:22.529 2 DEBUG oslo_concurrency.processutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:43:22.530 2 DEBUG oslo_concurrency.processutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:43:22.990 2 DEBUG oslo_concurrency.processutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:43:22.993 2 DEBUG nova.virt.libvirt.vif [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-09T14:43:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='test',display_name='test',ec2_ids=EC2Ids,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(3),hidden=False,host='standalone.localdomain',hostname='test',id=1,image_ref='ff8a5cba-0182-4533-98fc-6ff2255d8ad5',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='standalone.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='standalone.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c66b981ab228418cac8c3cbc73ada639',ramdisk_id='',reservation_id='r-fx6g9jfk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='ff8a5cba-0182-4533-98fc-6ff2255d8ad5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc-q35-rhel9.0.0',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='admin',owner_user_name='admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-09T14:43:17Z,user_data=None,user_id='edbfdb44c4234cc2878badaa44d34c1e',uuid=f9af0e26-e2a2-439e-9de7-367991eb09d8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fb9e0330-dc2b-4a7f-ac53-9170e023695a", "address": "fa:16:3e:10:ac:f1", "network": {"id": "32a6ee04-a550-43a4-bfdf-03b8e6b3dac4", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.98", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c66b981ab228418cac8c3cbc73ada639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapfb9e0330-dc", "ovs_interfaceid": "fb9e0330-dc2b-4a7f-ac53-9170e023695a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:566 2025-10-09 14:43:22.993 2 DEBUG nova.network.os_vif_util [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Converting VIF {"id": "fb9e0330-dc2b-4a7f-ac53-9170e023695a", "address": "fa:16:3e:10:ac:f1", "network": {"id": "32a6ee04-a550-43a4-bfdf-03b8e6b3dac4", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.98", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c66b981ab228418cac8c3cbc73ada639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapfb9e0330-dc", "ovs_interfaceid": "fb9e0330-dc2b-4a7f-ac53-9170e023695a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:523 2025-10-09 14:43:22.995 2 DEBUG nova.network.os_vif_util [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:ac:f1,bridge_name='br-int',has_traffic_filtering=True,id=fb9e0330-dc2b-4a7f-ac53-9170e023695a,network=Network(32a6ee04-a550-43a4-bfdf-03b8e6b3dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb9e0330-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:560 2025-10-09 14:43:22.998 2 DEBUG nova.objects.instance [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lazy-loading 'pci_devices' on Instance uuid f9af0e26-e2a2-439e-9de7-367991eb09d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1098 2025-10-09 14:43:23.013 2 DEBUG nova.virt.libvirt.driver [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] End _get_guest_xml xml= f9af0e26-e2a2-439e-9de7-367991eb09d8 instance-00000001 524288 1 test 2025-10-09 14:43:22 512 1 0 1 1 admin admin Red Hat OpenStack Compute 23.2.3-17.1.20250522071028.2ace99d.el9ost f9af0e26-e2a2-439e-9de7-367991eb09d8 f9af0e26-e2a2-439e-9de7-367991eb09d8 Virtual Machine hvm /dev/urandom _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7183 2025-10-09 14:43:23.014 2 DEBUG nova.compute.manager [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Preparing to wait for external event network-vif-plugged-fb9e0330-dc2b-4a7f-ac53-9170e023695a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:282 2025-10-09 14:43:23.014 2 DEBUG oslo_concurrency.lockutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lock "f9af0e26-e2a2-439e-9de7-367991eb09d8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:43:23.014 2 DEBUG oslo_concurrency.lockutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lock "f9af0e26-e2a2-439e-9de7-367991eb09d8-events" released by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:43:23.015 2 DEBUG nova.virt.libvirt.vif 
[req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-09T14:43:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='test',display_name='test',ec2_ids=EC2Ids,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(3),hidden=False,host='standalone.localdomain',hostname='test',id=1,image_ref='ff8a5cba-0182-4533-98fc-6ff2255d8ad5',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='standalone.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='standalone.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c66b981ab228418cac8c3cbc73ada639',ramdisk_id='',reservation_id='r-fx6g9jfk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='ff8a5cba-0182-4533-98fc-6ff2255d8ad5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc-q35-rhel9.0.0',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='admin',owner_user_name='admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-09T14:43:17Z,user_data=None,user_id='edbfdb44c4234cc2878badaa44d34c1e',uuid=f9af0e26-e2a2-439e-9de7-367991eb09d8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fb9e0330-dc2b-4a7f-ac53-9170e023695a", "address": "fa:16:3e:10:ac:f1", "network": {"id": "32a6ee04-a550-43a4-bfdf-03b8e6b3dac4", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.98", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c66b981ab228418cac8c3cbc73ada639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapfb9e0330-dc", "ovs_interfaceid": "fb9e0330-dc2b-4a7f-ac53-9170e023695a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:712 2025-10-09 14:43:23.015 2 DEBUG nova.network.os_vif_util [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Converting VIF {"id": "fb9e0330-dc2b-4a7f-ac53-9170e023695a", "address": "fa:16:3e:10:ac:f1", "network": {"id": "32a6ee04-a550-43a4-bfdf-03b8e6b3dac4", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": 
{"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.98", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c66b981ab228418cac8c3cbc73ada639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapfb9e0330-dc", "ovs_interfaceid": "fb9e0330-dc2b-4a7f-ac53-9170e023695a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:523 2025-10-09 14:43:23.017 2 DEBUG nova.network.os_vif_util [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:ac:f1,bridge_name='br-int',has_traffic_filtering=True,id=fb9e0330-dc2b-4a7f-ac53-9170e023695a,network=Network(32a6ee04-a550-43a4-bfdf-03b8e6b3dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb9e0330-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:560 2025-10-09 14:43:23.017 2 DEBUG os_vif [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:ac:f1,bridge_name='br-int',has_traffic_filtering=True,id=fb9e0330-dc2b-4a7f-ac53-9170e023695a,network=Network(32a6ee04-a550-43a4-bfdf-03b8e6b3dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb9e0330-dc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76 2025-10-09 14:43:23.235 2 DEBUG ovsdbapp.backend.ovs_idl [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:126 2025-10-09 14:43:23.236 2 DEBUG ovsdbapp.backend.ovs_idl [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:126 2025-10-09 14:43:23.236 2 DEBUG ovsdbapp.backend.ovs_idl [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:126 2025-10-09 14:43:23.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:43:23.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [POLLOUT] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:23.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [req-9cf73e51-889d-4289-aab4-e56248d69b85 
edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:43:23.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:23.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:23.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:23.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:23.267 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89 2025-10-09 14:43:23.268 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129 2025-10-09 14:43:23.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:23.268 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '61d491a8-2883-591f-b089-8782b89052c9', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89 2025-10-09 14:43:23.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:23.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:23.274 2 INFO oslo.privsep.daemon [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/usr/share/nova/nova-dist.conf', '--config-file', '/etc/nova/nova.conf', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpi1hrlyfk/privsep.sock'] 2025-10-09 14:43:24.156 2 INFO oslo.privsep.daemon [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Spawned new privsep daemon via rootwrap 2025-10-09 14:43:24.030 962 INFO oslo.privsep.daemon [-] privsep daemon starting 2025-10-09 14:43:24.033 962 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 2025-10-09 14:43:24.038 962 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN/CAP_NET_ADMIN/none 2025-10-09 14:43:24.038 962 INFO oslo.privsep.daemon [-] privsep daemon running as pid 962 2025-10-09 14:43:24.532 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:24.533 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(bridge=br-int, port=tapfb9e0330-dc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89 2025-10-09 14:43:24.534 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(table=Port, record=tapfb9e0330-dc, col_values=(('qos', UUID('f1114d74-26d9-4dbf-a6ee-abaf7c4dd41e')),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89 2025-10-09 14:43:24.537 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(table=Interface, record=tapfb9e0330-dc, col_values=(('external_ids', {'iface-id': 'fb9e0330-dc2b-4a7f-ac53-9170e023695a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:10:ac:f1', 'vm-uuid': 'f9af0e26-e2a2-439e-9de7-367991eb09d8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89 2025-10-09 14:43:24.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:24.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:43:24.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:24.592 2 INFO os_vif [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:ac:f1,bridge_name='br-int',has_traffic_filtering=True,id=fb9e0330-dc2b-4a7f-ac53-9170e023695a,network=Network(32a6ee04-a550-43a4-bfdf-03b8e6b3dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb9e0330-dc') 2025-10-09 14:43:24.642 2 DEBUG nova.virt.libvirt.driver [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11653 2025-10-09 14:43:24.643 2 DEBUG nova.virt.libvirt.driver [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] No BDM found with device name vdb, not building metadata. 
_build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11653 2025-10-09 14:43:24.643 2 DEBUG nova.virt.libvirt.driver [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] No VIF found with MAC fa:16:3e:10:ac:f1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11629 2025-10-09 14:43:24.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:24.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:24.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:24.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:24.968 2 DEBUG nova.compute.manager [req-d3f23b6a-adec-491e-a24f-6c3b64aab870 e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Received event network-vif-plugged-fb9e0330-dc2b-4a7f-ac53-9170e023695a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10683 2025-10-09 14:43:24.969 2 DEBUG oslo_concurrency.lockutils [req-d3f23b6a-adec-491e-a24f-6c3b64aab870 e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] Lock "f9af0e26-e2a2-439e-9de7-367991eb09d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:43:24.969 2 DEBUG oslo_concurrency.lockutils [req-d3f23b6a-adec-491e-a24f-6c3b64aab870 e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] Lock "f9af0e26-e2a2-439e-9de7-367991eb09d8-events" released by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:43:24.970 2 DEBUG nova.compute.manager [req-d3f23b6a-adec-491e-a24f-6c3b64aab870 e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Processing event network-vif-plugged-fb9e0330-dc2b-4a7f-ac53-9170e023695a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10449 2025-10-09 14:43:24.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:24.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:24.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:25.641 2 DEBUG nova.virt.driver [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Emitting event Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1612 2025-10-09 14:43:25.642 2 INFO nova.compute.manager [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] VM Started (Lifecycle Event) 2025-10-09 14:43:25.686 2 DEBUG nova.virt.libvirt.driver [req-9cf73e51-889d-4289-aab4-e56248d69b85 
edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4290 2025-10-09 14:43:25.692 2 INFO nova.virt.libvirt.driver [-] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Instance spawned successfully. 2025-10-09 14:43:25.692 2 DEBUG nova.virt.libvirt.driver [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:852 2025-10-09 14:43:25.697 2 DEBUG nova.compute.manager [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1655 2025-10-09 14:43:25.702 2 DEBUG nova.compute.manager [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1289 2025-10-09 14:43:25.712 2 DEBUG nova.block_device [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] block_device_list ['vdb'] volume_in_mapping /usr/lib/python3.9/site-packages/nova/block_device.py:617 2025-10-09 14:43:25.718 2 DEBUG nova.virt.libvirt.driver [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:881 2025-10-09 14:43:25.719 2 DEBUG nova.virt.libvirt.driver [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:881 2025-10-09 14:43:25.719 2 DEBUG nova.virt.libvirt.driver [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:881 2025-10-09 14:43:25.720 2 DEBUG nova.virt.libvirt.driver [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:881 2025-10-09 14:43:25.721 2 DEBUG nova.virt.libvirt.driver [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 
f9af0e26-e2a2-439e-9de7-367991eb09d8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:881 2025-10-09 14:43:25.722 2 DEBUG nova.virt.libvirt.driver [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:881 2025-10-09 14:43:25.756 2 INFO nova.compute.manager [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] During sync_power_state the instance has a pending task (spawning). Skip. 2025-10-09 14:43:25.757 2 DEBUG nova.virt.driver [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Emitting event Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1612 2025-10-09 14:43:25.757 2 INFO nova.compute.manager [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] VM Paused (Lifecycle Event) 2025-10-09 14:43:25.794 2 DEBUG nova.compute.manager [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1655 2025-10-09 14:43:25.798 2 DEBUG nova.virt.driver [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Emitting event Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1612 2025-10-09 14:43:25.798 2 INFO nova.compute.manager [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] VM Resumed (Lifecycle Event) 2025-10-09 14:43:25.834 2 DEBUG nova.compute.manager [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1655 2025-10-09 14:43:25.846 2 DEBUG nova.compute.manager [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1289 2025-10-09 14:43:25.854 2 INFO nova.compute.manager [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Took 8.67 seconds to spawn the instance on the hypervisor. 2025-10-09 14:43:25.855 2 DEBUG nova.compute.manager [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1655 2025-10-09 14:43:25.920 2 INFO nova.compute.manager [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] During sync_power_state the instance has a pending task (spawning). Skip. 2025-10-09 14:43:25.940 2 INFO nova.compute.manager [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Took 9.62 seconds to build instance. 
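The plug sequence recorded above ("Plugging vif VIFOpenVSwitch(...)" through "Successfully plugged vif") goes through the public os-vif entry points. A minimal sketch of how a caller typically drives that API, assuming the 'ovs' plugin is installed and reusing the values that appear in the log (the InstanceInfo name and the exact field set are illustrative, not taken from the trace):

# Sketch only: mirrors the os_vif plug shown in the log; field values copied from the trace,
# everything else (instance name, omitted optional fields) is an assumption.
import os_vif
from os_vif.objects import instance_info, network, vif

os_vif.initialize()  # loads the linux_bridge/noop/ovs plugins, as in the startup log

net = network.Network(id='32a6ee04-a550-43a4-bfdf-03b8e6b3dac4', bridge='br-int')
port = vif.VIFOpenVSwitch(
    id='fb9e0330-dc2b-4a7f-ac53-9170e023695a',
    address='fa:16:3e:10:ac:f1',
    network=net,
    vif_name='tapfb9e0330-dc',
    plugin='ovs',
    port_profile=vif.VIFPortProfileOpenVSwitch(
        interface_id='fb9e0330-dc2b-4a7f-ac53-9170e023695a'),
)
inst = instance_info.InstanceInfo(
    uuid='f9af0e26-e2a2-439e-9de7-367991eb09d8',
    name='instance-00000001')  # assumed name, for illustration

os_vif.plug(port, inst)    # produces the "Plugging vif ... / Successfully plugged vif" pair
# os_vif.unplug(port, inst) would reverse it on teardown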
2025-10-09 14:43:25.968 2 DEBUG oslo_concurrency.lockutils [req-9cf73e51-889d-4289-aab4-e56248d69b85 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lock "f9af0e26-e2a2-439e-9de7-367991eb09d8" released by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 9.735s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:43:26.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:27.040 2 DEBUG nova.compute.manager [req-996ff1c5-8891-4523-ad81-3801c58088be e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Received event network-vif-plugged-fb9e0330-dc2b-4a7f-ac53-9170e023695a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10683 2025-10-09 14:43:27.040 2 DEBUG oslo_concurrency.lockutils [req-996ff1c5-8891-4523-ad81-3801c58088be e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] Lock "f9af0e26-e2a2-439e-9de7-367991eb09d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:43:27.041 2 DEBUG oslo_concurrency.lockutils [req-996ff1c5-8891-4523-ad81-3801c58088be e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] Lock "f9af0e26-e2a2-439e-9de7-367991eb09d8-events" released by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:43:27.042 2 DEBUG nova.compute.manager [req-996ff1c5-8891-4523-ad81-3801c58088be e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] No waiting events found dispatching network-vif-plugged-fb9e0330-dc2b-4a7f-ac53-9170e023695a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:319 2025-10-09 14:43:27.043 2 WARNING nova.compute.manager [req-996ff1c5-8891-4523-ad81-3801c58088be e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Received unexpected event network-vif-plugged-fb9e0330-dc2b-4a7f-ac53-9170e023695a for instance with vm_state active and task_state None. 2025-10-09 14:43:29.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:31.029 2 DEBUG nova.compute.manager [req-e558ba69-f0cd-4431-89a2-95a0ec961973 e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Received event network-changed-fb9e0330-dc2b-4a7f-ac53-9170e023695a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10683 2025-10-09 14:43:31.030 2 DEBUG nova.compute.manager [req-e558ba69-f0cd-4431-89a2-95a0ec961973 e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Refreshing instance network info cache due to event network-changed-fb9e0330-dc2b-4a7f-ac53-9170e023695a. 
external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10688 2025-10-09 14:43:31.030 2 DEBUG oslo_concurrency.lockutils [req-e558ba69-f0cd-4431-89a2-95a0ec961973 e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] Acquired lock "refresh_cache-f9af0e26-e2a2-439e-9de7-367991eb09d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:266 2025-10-09 14:43:31.031 2 DEBUG nova.network.neutron [req-e558ba69-f0cd-4431-89a2-95a0ec961973 e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Refreshing network info cache for port fb9e0330-dc2b-4a7f-ac53-9170e023695a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:1869 2025-10-09 14:43:31.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:31.884 2 DEBUG nova.network.neutron [req-e558ba69-f0cd-4431-89a2-95a0ec961973 e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Updated VIF entry in instance network info cache for port fb9e0330-dc2b-4a7f-ac53-9170e023695a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3183 2025-10-09 14:43:31.884 2 DEBUG nova.network.neutron [req-e558ba69-f0cd-4431-89a2-95a0ec961973 e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Updating instance_info_cache with network_info: [{"id": "fb9e0330-dc2b-4a7f-ac53-9170e023695a", "address": "fa:16:3e:10:ac:f1", "network": {"id": "32a6ee04-a550-43a4-bfdf-03b8e6b3dac4", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.98", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c66b981ab228418cac8c3cbc73ada639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapfb9e0330-dc", "ovs_interfaceid": "fb9e0330-dc2b-4a7f-ac53-9170e023695a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116 2025-10-09 14:43:31.903 2 DEBUG oslo_concurrency.lockutils [req-e558ba69-f0cd-4431-89a2-95a0ec961973 e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] Releasing lock "refresh_cache-f9af0e26-e2a2-439e-9de7-367991eb09d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:282 2025-10-09 14:43:34.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:36.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:39.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 
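The ovsdbapp transaction entries earlier in the trace (AddBridgeCommand, AddPortCommand and the DbSetCommand calls against tcp:127.0.0.1:6640) are what the ovs plugin issues while wiring tapfb9e0330-dc into br-int. A rough sketch of the equivalent standalone ovsdbapp usage, assuming a local ovsdb-server is listening on the same endpoint; the timeout and connection setup here are illustrative:

# Sketch only: reproduces two of the three commands seen in the log
# (the Port.qos DbSetCommand is skipped because it needs the QoS row UUID created earlier).
from ovsdbapp.backend.ovs_idl import connection
from ovsdbapp.schema.open_vswitch import impl_idl

# Connect to the endpoint the log shows ("tcp:127.0.0.1:6640: entering ACTIVE").
idl = connection.OvsdbIdl.from_server('tcp:127.0.0.1:6640', 'Open_vSwitch')
api = impl_idl.OvsdbIdl(connection.Connection(idl, timeout=10))

with api.transaction(check_error=True) as txn:
    txn.add(api.add_port('br-int', 'tapfb9e0330-dc', may_exist=True))
    txn.add(api.db_set('Interface', 'tapfb9e0330-dc',
                       ('external_ids', {
                           'iface-id': 'fb9e0330-dc2b-4a7f-ac53-9170e023695a',
                           'iface-status': 'active',
                           'attached-mac': 'fa:16:3e:10:ac:f1',
                           'vm-uuid': 'f9af0e26-e2a2-439e-9de7-367991eb09d8'})))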
2025-10-09 14:43:41.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:44.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:45.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:43:45.723 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-09 14:43:46.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:48.723 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:43:48.723 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:43:49.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:50.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:43:50.722 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-09 14:43:50.722 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518 2025-10-09 14:43:50.852 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Acquired lock "refresh_cache-f9af0e26-e2a2-439e-9de7-367991eb09d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:266 2025-10-09 14:43:50.852 2 DEBUG nova.network.neutron [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:1866 2025-10-09 14:43:50.853 2 DEBUG nova.objects.instance [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lazy-loading 'info_cache' on Instance uuid f9af0e26-e2a2-439e-9de7-367991eb09d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1098 2025-10-09 14:43:51.336 2 DEBUG nova.network.neutron [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Updating instance_info_cache with network_info: [{"id": "fb9e0330-dc2b-4a7f-ac53-9170e023695a", "address": "fa:16:3e:10:ac:f1", "network": {"id": "32a6ee04-a550-43a4-bfdf-03b8e6b3dac4", "bridge": "br-int", "label": "private", "subnets": [{"cidr": 
"192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.98", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c66b981ab228418cac8c3cbc73ada639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapfb9e0330-dc", "ovs_interfaceid": "fb9e0330-dc2b-4a7f-ac53-9170e023695a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116 2025-10-09 14:43:51.349 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Releasing lock "refresh_cache-f9af0e26-e2a2-439e-9de7-367991eb09d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:282 2025-10-09 14:43:51.350 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9585 2025-10-09 14:43:51.350 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:43:51.351 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:43:51.367 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:43:51.368 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:43:51.368 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858 2025-10-09 14:43:51.369 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:43:51.835 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:43:51.837 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:51.910 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:43:51.911 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:43:52.167 2 WARNING nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 2025-10-09 14:43:52.169 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=4038MB free_disk=6.9506683349609375GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033 2025-10-09 14:43:52.169 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:43:52.240 2 DEBUG nova.compute.resource_tracker 
[req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Instance f9af0e26-e2a2-439e-9de7-367991eb09d8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1548 2025-10-09 14:43:52.241 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056 2025-10-09 14:43:52.241 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=6GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065 2025-10-09 14:43:52.268 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:43:52.714 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:43:52.719 2 DEBUG nova.compute.provider_tree [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Inventory has not changed in ProviderTree for provider: 019ad6c7-b66d-4a42-8fad-340daa8e4d4a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179 2025-10-09 14:43:52.731 2 DEBUG nova.scheduler.client.report [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Inventory has not changed for provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 2025-10-09 14:43:52.731 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994 2025-10-09 14:43:52.731 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.562s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:43:53.103 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:43:53.104 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:43:53.119 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:43:54.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:56.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:43:59.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:01.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:04.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:06.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:09.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:11.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:13.495 2 DEBUG oslo_concurrency.lockutils [req-f9538546-b758-474a-a063-eec0a41a02bd edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lock "f9af0e26-e2a2-439e-9de7-367991eb09d8" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:44:13.512 2 DEBUG nova.block_device [req-f9538546-b758-474a-a063-eec0a41a02bd edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] block_device_list ['vdb'] volume_in_mapping /usr/lib/python3.9/site-packages/nova/block_device.py:617 2025-10-09 14:44:13.512 2 DEBUG nova.objects.instance [req-f9538546-b758-474a-a063-eec0a41a02bd edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lazy-loading 'flavor' on Instance uuid f9af0e26-e2a2-439e-9de7-367991eb09d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1098 2025-10-09 14:44:13.561 2 DEBUG oslo_concurrency.lockutils [req-f9538546-b758-474a-a063-eec0a41a02bd edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lock "f9af0e26-e2a2-439e-9de7-367991eb09d8" released by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: held 0.066s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:44:13.686 2 DEBUG oslo_concurrency.lockutils [req-f9538546-b758-474a-a063-eec0a41a02bd edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lock "f9af0e26-e2a2-439e-9de7-367991eb09d8" acquired by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:44:13.687 2 INFO nova.compute.manager [req-f9538546-b758-474a-a063-eec0a41a02bd edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 
f9af0e26-e2a2-439e-9de7-367991eb09d8] Attaching volume 950d6db9-e4a2-47c9-a85a-b7c7148d06ce to /dev/vdc 2025-10-09 14:44:13.752 2 DEBUG os_brick.utils [req-f9538546-b758-474a-a063-eec0a41a02bd edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '172.17.0.100', 'multipath': False, 'enforce_multipath': True, 'host': 'standalone.localdomain', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:163 2025-10-09 14:44:13.754 2 INFO oslo.privsep.daemon [req-f9538546-b758-474a-a063-eec0a41a02bd edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/usr/share/nova/nova-dist.conf', '--config-file', '/etc/nova/nova.conf', '--privsep_context', 'os_brick.privileged.default', '--privsep_sock_path', '/tmp/tmp720dm6uy/privsep.sock'] 2025-10-09 14:44:14.526 2 INFO oslo.privsep.daemon [req-f9538546-b758-474a-a063-eec0a41a02bd edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Spawned new privsep daemon via rootwrap 2025-10-09 14:44:14.389 1022 INFO oslo.privsep.daemon [-] privsep daemon starting 2025-10-09 14:44:14.394 1022 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 2025-10-09 14:44:14.398 1022 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none 2025-10-09 14:44:14.398 1022 INFO oslo.privsep.daemon [-] privsep daemon running as pid 1022 2025-10-09 14:44:14.529 1022 DEBUG oslo.privsep.daemon [-] privsep: reply[140418455479392]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:512 2025-10-09 14:44:14.644 1022 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:44:14.653 1022 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:44:14.654 1022 DEBUG oslo.privsep.daemon [-] privsep: reply[140418455479392]: (4, ('InitiatorName=iqn.1994-05.com.redhat:e6d3ffa69b9a\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:512 2025-10-09 14:44:14.655 2 WARNING os_brick.initiator.connectors.nvmeof [req-f9538546-b758-474a-a063-eec0a41a02bd edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Could not find nvme_core/parameters/multipath: FileNotFoundError: [Errno 2] No such file or directory: '/sys/module/nvme_core/parameters/multipath' 2025-10-09 14:44:14.656 1022 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:44:14.665 1022 DEBUG oslo_concurrency.processutils [-] CMD "findmnt / -n -o SOURCE" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:44:14.666 1022 DEBUG oslo.privsep.daemon [-] privsep: reply[140418455479392]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:512 2025-10-09 14:44:14.668 1022 DEBUG oslo.privsep.daemon [-] privsep: reply[140418455479392]: (4, 
'da7dd545-e3fc-420a-b1b5-9235ad14ded1') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:512 2025-10-09 14:44:14.668 2 DEBUG oslo_concurrency.processutils [req-f9538546-b758-474a-a063-eec0a41a02bd edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:44:14.690 2 DEBUG oslo_concurrency.processutils [req-f9538546-b758-474a-a063-eec0a41a02bd edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] CMD "nvme version" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:44:14.692 2 DEBUG os_brick.utils [req-f9538546-b758-474a-a063-eec0a41a02bd edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] <== get_connector_properties: return (939ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '172.17.0.100', 'host': 'standalone.localdomain', 'multipath': False, 'initiator': 'iqn.1994-05.com.redhat:e6d3ffa69b9a', 'do_local_attach': False, 'system uuid': 'da7dd545-e3fc-420a-b1b5-9235ad14ded1', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:33069693-bb8b-4414-95bc-37b06483388d', 'nvme_native_multipath': False} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:190 2025-10-09 14:44:14.693 2 DEBUG nova.virt.block_device [req-f9538546-b758-474a-a063-eec0a41a02bd edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Updating existing volume attachment record: dc49f703-50a9-4a8b-a7c6-f152db817e2a _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:569 2025-10-09 14:44:14.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:15.514 2 DEBUG oslo_concurrency.lockutils [req-f9538546-b758-474a-a063-eec0a41a02bd edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lock "cache_volume_driver" acquired by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.._cache_volume_driver" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:44:15.517 2 DEBUG oslo_concurrency.lockutils [req-f9538546-b758-474a-a063-eec0a41a02bd edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lock "cache_volume_driver" released by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.._cache_volume_driver" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:44:15.523 2 DEBUG nova.objects.instance [req-f9538546-b758-474a-a063-eec0a41a02bd edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lazy-loading 'flavor' on Instance uuid f9af0e26-e2a2-439e-9de7-367991eb09d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1098 2025-10-09 14:44:15.571 2 DEBUG nova.virt.libvirt.driver [req-f9538546-b758-474a-a063-eec0a41a02bd edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Attempting to attach volume 950d6db9-e4a2-47c9-a85a-b7c7148d06ce with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. 
_check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2108 2025-10-09 14:44:15.574 2 DEBUG nova.virt.libvirt.guest [req-f9538546-b758-474a-a063-eec0a41a02bd edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] attach device xml: 950d6db9-e4a2-47c9-a85a-b7c7148d06ce attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:320 2025-10-09 14:44:15.666 2 DEBUG nova.virt.libvirt.driver [req-f9538546-b758-474a-a063-eec0a41a02bd edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11653 2025-10-09 14:44:15.667 2 DEBUG nova.virt.libvirt.driver [req-f9538546-b758-474a-a063-eec0a41a02bd edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11653 2025-10-09 14:44:15.667 2 DEBUG nova.virt.libvirt.driver [req-f9538546-b758-474a-a063-eec0a41a02bd edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] No BDM found with device name vdc, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11653 2025-10-09 14:44:15.668 2 DEBUG nova.virt.libvirt.driver [req-f9538546-b758-474a-a063-eec0a41a02bd edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] No VIF found with MAC fa:16:3e:10:ac:f1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11629 2025-10-09 14:44:15.846 2 DEBUG oslo_concurrency.lockutils [req-f9538546-b758-474a-a063-eec0a41a02bd edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lock "f9af0e26-e2a2-439e-9de7-367991eb09d8" released by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: held 2.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:44:16.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:19.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:21.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:24.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:26.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:28.155 2 DEBUG oslo_concurrency.lockutils [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lock "45dc88d8-d6b0-4a8b-80b8-07b922fcb467" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:44:28.168 2 DEBUG nova.compute.manager [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] 
Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2306 2025-10-09 14:44:28.236 2 DEBUG oslo_concurrency.lockutils [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:44:28.240 2 DEBUG nova.virt.hardware [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2200 2025-10-09 14:44:28.241 2 INFO nova.compute.claims [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Claim successful on node standalone.localdomain 2025-10-09 14:44:28.319 2 DEBUG oslo_concurrency.processutils [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:44:28.740 2 DEBUG oslo_concurrency.processutils [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:44:28.745 2 DEBUG nova.compute.provider_tree [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Inventory has not changed in ProviderTree for provider: 019ad6c7-b66d-4a42-8fad-340daa8e4d4a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179 2025-10-09 14:44:28.756 2 DEBUG nova.scheduler.client.report [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Inventory has not changed for provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 2025-10-09 14:44:28.756 2 DEBUG oslo_concurrency.lockutils [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.520s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:44:28.757 2 DEBUG nova.compute.manager [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Start building networks asynchronously for 
instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2679 2025-10-09 14:44:28.808 2 DEBUG nova.compute.manager [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1845 2025-10-09 14:44:28.809 2 DEBUG nova.network.neutron [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1020 2025-10-09 14:44:28.820 2 INFO nova.virt.libvirt.driver [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names 2025-10-09 14:44:28.846 2 DEBUG nova.compute.manager [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2714 2025-10-09 14:44:28.895 2 INFO nova.virt.block_device [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Booting with volume f7433a0a-f68e-46b4-884d-f15914f2e8c9 at /dev/vda 2025-10-09 14:44:28.958 2 DEBUG os_brick.utils [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '172.17.0.100', 'multipath': False, 'enforce_multipath': True, 'host': 'standalone.localdomain', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:163 2025-10-09 14:44:28.960 1022 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:44:28.969 1022 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:44:28.969 1022 DEBUG oslo.privsep.daemon [-] privsep: reply[140418588965168]: (4, ('InitiatorName=iqn.1994-05.com.redhat:e6d3ffa69b9a\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:512 2025-10-09 14:44:28.970 1022 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:44:28.979 1022 DEBUG oslo_concurrency.processutils [-] CMD "findmnt / -n -o SOURCE" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:44:28.979 1022 DEBUG oslo.privsep.daemon [-] privsep: reply[140418588965168]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:512 2025-10-09 14:44:28.981 1022 DEBUG oslo.privsep.daemon [-] privsep: reply[140418588965168]: 
(4, 'da7dd545-e3fc-420a-b1b5-9235ad14ded1') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:512 2025-10-09 14:44:28.982 2 DEBUG oslo_concurrency.processutils [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:44:28.995 2 DEBUG oslo_concurrency.processutils [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] CMD "nvme version" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:44:28.996 2 DEBUG os_brick.utils [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] <== get_connector_properties: return (36ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '172.17.0.100', 'host': 'standalone.localdomain', 'multipath': False, 'initiator': 'iqn.1994-05.com.redhat:e6d3ffa69b9a', 'do_local_attach': False, 'system uuid': 'da7dd545-e3fc-420a-b1b5-9235ad14ded1', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:33069693-bb8b-4414-95bc-37b06483388d', 'nvme_native_multipath': False} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:190 2025-10-09 14:44:28.996 2 DEBUG nova.virt.block_device [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Updating existing volume attachment record: 66e373bb-713f-4216-8788-d515a06bf183 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:569 2025-10-09 14:44:29.552 2 DEBUG nova.network.neutron [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Successfully created port: d290d4af-922f-473e-8928-23571339137a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:549 2025-10-09 14:44:29.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:29.970 2 DEBUG nova.compute.manager [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Start spawning the instance on the hypervisor. 
_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2510 2025-10-09 14:44:29.973 2 DEBUG nova.virt.libvirt.driver [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4592 2025-10-09 14:44:29.974 2 INFO nova.virt.libvirt.driver [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Creating image 2025-10-09 14:44:30.017 2 DEBUG nova.storage.rbd_utils [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] rbd image 45dc88d8-d6b0-4a8b-80b8-07b922fcb467_disk.eph0 does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:79 2025-10-09 14:44:30.049 2 DEBUG nova.storage.rbd_utils [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] rbd image 45dc88d8-d6b0-4a8b-80b8-07b922fcb467_disk.eph0 does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:79 2025-10-09 14:44:30.092 2 DEBUG nova.storage.rbd_utils [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] rbd image 45dc88d8-d6b0-4a8b-80b8-07b922fcb467_disk.eph0 does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:79 2025-10-09 14:44:30.106 2 DEBUG oslo_concurrency.processutils [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:44:30.168 2 DEBUG oslo_concurrency.processutils [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:44:30.170 2 DEBUG oslo_concurrency.lockutils [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lock "ephemeral_1_0706d66" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:44:30.171 2 DEBUG oslo_concurrency.lockutils [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lock "ephemeral_1_0706d66" released by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:44:30.208 2 DEBUG nova.storage.rbd_utils [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] rbd image 
45dc88d8-d6b0-4a8b-80b8-07b922fcb467_disk.eph0 does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:79 2025-10-09 14:44:30.213 2 DEBUG oslo_concurrency.processutils [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/ephemeral_1_0706d66 45dc88d8-d6b0-4a8b-80b8-07b922fcb467_disk.eph0 --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:44:30.437 2 DEBUG nova.network.neutron [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Successfully updated port: d290d4af-922f-473e-8928-23571339137a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:587 2025-10-09 14:44:30.463 2 DEBUG oslo_concurrency.lockutils [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Acquired lock "refresh_cache-45dc88d8-d6b0-4a8b-80b8-07b922fcb467" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:266 2025-10-09 14:44:30.463 2 DEBUG nova.network.neutron [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:1872 2025-10-09 14:44:30.493 2 DEBUG nova.compute.manager [req-daa47b9c-01d8-47a1-b88d-b2a9cf073fc4 e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Received event network-changed-d290d4af-922f-473e-8928-23571339137a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10683 2025-10-09 14:44:30.494 2 DEBUG nova.compute.manager [req-daa47b9c-01d8-47a1-b88d-b2a9cf073fc4 e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Refreshing instance network info cache due to event network-changed-d290d4af-922f-473e-8928-23571339137a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10688 2025-10-09 14:44:30.542 2 DEBUG nova.network.neutron [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Instance cache missing network info. 
_get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3049 2025-10-09 14:44:30.807 2 DEBUG nova.network.neutron [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Updating instance_info_cache with network_info: [{"id": "d290d4af-922f-473e-8928-23571339137a", "address": "fa:16:3e:56:da:7d", "network": {"id": "32a6ee04-a550-43a4-bfdf-03b8e6b3dac4", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.158", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c66b981ab228418cac8c3cbc73ada639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapd290d4af-92", "ovs_interfaceid": "d290d4af-922f-473e-8928-23571339137a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116 2025-10-09 14:44:30.829 2 DEBUG oslo_concurrency.lockutils [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Releasing lock "refresh_cache-45dc88d8-d6b0-4a8b-80b8-07b922fcb467" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:282 2025-10-09 14:44:30.830 2 DEBUG nova.compute.manager [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Instance network_info: |[{"id": "d290d4af-922f-473e-8928-23571339137a", "address": "fa:16:3e:56:da:7d", "network": {"id": "32a6ee04-a550-43a4-bfdf-03b8e6b3dac4", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.158", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c66b981ab228418cac8c3cbc73ada639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapd290d4af-92", "ovs_interfaceid": "d290d4af-922f-473e-8928-23571339137a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1859 2025-10-09 14:44:30.830 2 DEBUG oslo_concurrency.lockutils [req-daa47b9c-01d8-47a1-b88d-b2a9cf073fc4 e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] Acquired lock "refresh_cache-45dc88d8-d6b0-4a8b-80b8-07b922fcb467" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:266 2025-10-09 14:44:30.831 2 DEBUG nova.network.neutron [req-daa47b9c-01d8-47a1-b88d-b2a9cf073fc4 e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - 
default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Refreshing network info cache for port d290d4af-922f-473e-8928-23571339137a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:1869 2025-10-09 14:44:30.940 2 DEBUG oslo_concurrency.processutils [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/ephemeral_1_0706d66 45dc88d8-d6b0-4a8b-80b8-07b922fcb467_disk.eph0 --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.727s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:44:31.042 2 DEBUG nova.virt.libvirt.driver [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719 2025-10-09 14:44:31.043 2 DEBUG nova.virt.libvirt.driver [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Ensure instance console log exists: /var/lib/nova/instances/45dc88d8-d6b0-4a8b-80b8-07b922fcb467/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4482 2025-10-09 14:44:31.044 2 DEBUG oslo_concurrency.lockutils [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:44:31.044 2 DEBUG oslo_concurrency.lockutils [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lock "vgpu_resources" released by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:44:31.049 2 DEBUG nova.virt.libvirt.driver [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Start _get_guest_xml network_info=[{"id": "d290d4af-922f-473e-8928-23571339137a", "address": "fa:16:3e:56:da:7d", "network": {"id": "32a6ee04-a550-43a4-bfdf-03b8e6b3dac4", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.158", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c66b981ab228418cac8c3cbc73ada639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapd290d4af-92", "ovs_interfaceid": "d290d4af-922f-473e-8928-23571339137a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 
'disk', 'boot_index': '1'}, 'disk.eph0': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum=,container_format=,created_at=,direct_url=,disk_format=,id=,min_disk=0,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'ephemerals': [{'device_name': '/dev/vdb', 'device_type': 'disk', 'disk_bus': 'virtio', 'size': 1, 'guest_format': None}], 'block_device_mapping': [{'boot_index': 0, 'device_type': 'disk', 'delete_on_termination': False, 'disk_bus': 'virtio', 'attachment_id': '66e373bb-713f-4216-8788-d515a06bf183', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-f7433a0a-f68e-46b4-884d-f15914f2e8c9', 'hosts': ['172.18.0.100'], 'ports': ['6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'f7433a0a-f68e-46b4-884d-f15914f2e8c9', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '45dc88d8-d6b0-4a8b-80b8-07b922fcb467', 'attached_at': '', 'detached_at': '', 'volume_id': 'f7433a0a-f68e-46b4-884d-f15914f2e8c9', 'serial': 'f7433a0a-f68e-46b4-884d-f15914f2e8c9'}, 'guest_format': None, 'mount_device': '/dev/vda', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7177 2025-10-09 14:44:31.054 2 WARNING nova.virt.libvirt.driver [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 2025-10-09 14:44:31.056 2 DEBUG nova.virt.libvirt.host [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Searching host: 'standalone.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1463 2025-10-09 14:44:31.058 2 DEBUG nova.virt.libvirt.host [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1473 2025-10-09 14:44:31.060 2 DEBUG nova.virt.libvirt.host [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Searching host: 'standalone.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1482 2025-10-09 14:44:31.060 2 DEBUG nova.virt.libvirt.host [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] CPU controller found on host. 
_has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1489 2025-10-09 14:44:31.061 2 DEBUG nova.virt.libvirt.driver [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5177 2025-10-09 14:44:31.061 2 DEBUG nova.virt.hardware [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-09T14:42:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=1,extra_specs={},flavorid='dfc79ac3-5f0f-42f3-94a9-2ab50a6911df',id=3,is_public=True,memory_mb=512,name='m1.small',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format=,created_at=,direct_url=,disk_format=,id=,min_disk=0,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=,virtual_size=,visibility=), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:558 2025-10-09 14:44:31.062 2 DEBUG nova.virt.hardware [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:343 2025-10-09 14:44:31.062 2 DEBUG nova.virt.hardware [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:347 2025-10-09 14:44:31.062 2 DEBUG nova.virt.hardware [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:383 2025-10-09 14:44:31.063 2 DEBUG nova.virt.hardware [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:387 2025-10-09 14:44:31.063 2 DEBUG nova.virt.hardware [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:425 2025-10-09 14:44:31.063 2 DEBUG nova.virt.hardware [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:564 2025-10-09 14:44:31.064 2 DEBUG nova.virt.hardware [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:466 2025-10-09 14:44:31.064 2 DEBUG 
nova.virt.hardware [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:496 2025-10-09 14:44:31.064 2 DEBUG nova.virt.hardware [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:570 2025-10-09 14:44:31.065 2 DEBUG nova.virt.hardware [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:572 2025-10-09 14:44:31.066 2 DEBUG oslo_concurrency.processutils [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:44:31.311 2 DEBUG nova.network.neutron [req-daa47b9c-01d8-47a1-b88d-b2a9cf073fc4 e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Updated VIF entry in instance network info cache for port d290d4af-922f-473e-8928-23571339137a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3183 2025-10-09 14:44:31.312 2 DEBUG nova.network.neutron [req-daa47b9c-01d8-47a1-b88d-b2a9cf073fc4 e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Updating instance_info_cache with network_info: [{"id": "d290d4af-922f-473e-8928-23571339137a", "address": "fa:16:3e:56:da:7d", "network": {"id": "32a6ee04-a550-43a4-bfdf-03b8e6b3dac4", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.158", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c66b981ab228418cac8c3cbc73ada639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapd290d4af-92", "ovs_interfaceid": "d290d4af-922f-473e-8928-23571339137a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116 2025-10-09 14:44:31.328 2 DEBUG oslo_concurrency.lockutils [req-daa47b9c-01d8-47a1-b88d-b2a9cf073fc4 e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] Releasing lock "refresh_cache-45dc88d8-d6b0-4a8b-80b8-07b922fcb467" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:282 2025-10-09 14:44:31.522 2 DEBUG oslo_concurrency.processutils [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - 
default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:44:31.552 2 DEBUG nova.virt.libvirt.vif [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-09T14:44:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='bfv-server',display_name='bfv-server',ec2_ids=EC2Ids,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(3),hidden=False,host='standalone.localdomain',hostname='bfv-server',id=2,image_ref='',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='standalone.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='standalone.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c66b981ab228418cac8c3cbc73ada639',ramdisk_id='',reservation_id='r-9gfxqt7u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='',image_hw_machine_type='pc-q35-rhel9.0.0',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',image_signature_verified='False',network_allocated='True',owner_project_name='admin',owner_user_name='admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-09T14:44:28Z,user_data=None,user_id='edbfdb44c4234cc2878badaa44d34c1e',uuid=45dc88d8-d6b0-4a8b-80b8-07b922fcb467,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d290d4af-922f-473e-8928-23571339137a", "address": "fa:16:3e:56:da:7d", "network": {"id": "32a6ee04-a550-43a4-bfdf-03b8e6b3dac4", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.158", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c66b981ab228418cac8c3cbc73ada639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapd290d4af-92", "ovs_interfaceid": "d290d4af-922f-473e-8928-23571339137a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:566 2025-10-09 14:44:31.553 2 DEBUG nova.network.os_vif_util [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Converting VIF {"id": "d290d4af-922f-473e-8928-23571339137a", "address": 
"fa:16:3e:56:da:7d", "network": {"id": "32a6ee04-a550-43a4-bfdf-03b8e6b3dac4", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.158", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c66b981ab228418cac8c3cbc73ada639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapd290d4af-92", "ovs_interfaceid": "d290d4af-922f-473e-8928-23571339137a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:523 2025-10-09 14:44:31.555 2 DEBUG nova.network.os_vif_util [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:da:7d,bridge_name='br-int',has_traffic_filtering=True,id=d290d4af-922f-473e-8928-23571339137a,network=Network(32a6ee04-a550-43a4-bfdf-03b8e6b3dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd290d4af-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:560 2025-10-09 14:44:31.556 2 DEBUG nova.objects.instance [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lazy-loading 'pci_devices' on Instance uuid 45dc88d8-d6b0-4a8b-80b8-07b922fcb467 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1098 2025-10-09 14:44:31.571 2 DEBUG nova.virt.libvirt.driver [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] End _get_guest_xml xml= 45dc88d8-d6b0-4a8b-80b8-07b922fcb467 instance-00000002 524288 1 bfv-server 2025-10-09 14:44:31 512 1 0 1 1 admin admin Red Hat OpenStack Compute 23.2.3-17.1.20250522071028.2ace99d.el9ost 45dc88d8-d6b0-4a8b-80b8-07b922fcb467 45dc88d8-d6b0-4a8b-80b8-07b922fcb467 Virtual Machine hvm f7433a0a-f68e-46b4-884d-f15914f2e8c9 /dev/urandom _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7183 2025-10-09 14:44:31.572 2 DEBUG nova.compute.manager [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Preparing to wait for external event network-vif-plugged-d290d4af-922f-473e-8928-23571339137a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:282 2025-10-09 14:44:31.572 2 DEBUG oslo_concurrency.lockutils [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lock "45dc88d8-d6b0-4a8b-80b8-07b922fcb467-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:44:31.573 2 DEBUG oslo_concurrency.lockutils [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e 
c66b981ab228418cac8c3cbc73ada639 - default default] Lock "45dc88d8-d6b0-4a8b-80b8-07b922fcb467-events" released by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:44:31.574 2 DEBUG nova.virt.libvirt.vif [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-09T14:44:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='bfv-server',display_name='bfv-server',ec2_ids=EC2Ids,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(3),hidden=False,host='standalone.localdomain',hostname='bfv-server',id=2,image_ref='',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='standalone.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='standalone.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c66b981ab228418cac8c3cbc73ada639',ramdisk_id='',reservation_id='r-9gfxqt7u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='',image_hw_machine_type='pc-q35-rhel9.0.0',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',image_signature_verified='False',network_allocated='True',owner_project_name='admin',owner_user_name='admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-09T14:44:28Z,user_data=None,user_id='edbfdb44c4234cc2878badaa44d34c1e',uuid=45dc88d8-d6b0-4a8b-80b8-07b922fcb467,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d290d4af-922f-473e-8928-23571339137a", "address": "fa:16:3e:56:da:7d", "network": {"id": "32a6ee04-a550-43a4-bfdf-03b8e6b3dac4", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.158", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c66b981ab228418cac8c3cbc73ada639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapd290d4af-92", "ovs_interfaceid": "d290d4af-922f-473e-8928-23571339137a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:712 2025-10-09 14:44:31.574 2 DEBUG nova.network.os_vif_util [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default 
default] Converting VIF {"id": "d290d4af-922f-473e-8928-23571339137a", "address": "fa:16:3e:56:da:7d", "network": {"id": "32a6ee04-a550-43a4-bfdf-03b8e6b3dac4", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.158", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c66b981ab228418cac8c3cbc73ada639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapd290d4af-92", "ovs_interfaceid": "d290d4af-922f-473e-8928-23571339137a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:523 2025-10-09 14:44:31.577 2 DEBUG nova.network.os_vif_util [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:da:7d,bridge_name='br-int',has_traffic_filtering=True,id=d290d4af-922f-473e-8928-23571339137a,network=Network(32a6ee04-a550-43a4-bfdf-03b8e6b3dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd290d4af-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:560 2025-10-09 14:44:31.577 2 DEBUG os_vif [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:da:7d,bridge_name='br-int',has_traffic_filtering=True,id=d290d4af-922f-473e-8928-23571339137a,network=Network(32a6ee04-a550-43a4-bfdf-03b8e6b3dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd290d4af-92') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76 2025-10-09 14:44:31.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:31.579 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89 2025-10-09 14:44:31.579 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129 2025-10-09 14:44:31.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:31.581 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '90be018a-b427-56b1-b5a5-4193d8c24055', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89 2025-10-09 14:44:31.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:31.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:44:31.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:31.641 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(bridge=br-int, port=tapd290d4af-92, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89 2025-10-09 14:44:31.642 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(table=Port, record=tapd290d4af-92, col_values=(('qos', UUID('4b2c10de-ed22-4318-b799-5c1df13be37d')),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89 2025-10-09 14:44:31.642 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(table=Interface, record=tapd290d4af-92, col_values=(('external_ids', {'iface-id': 'd290d4af-922f-473e-8928-23571339137a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:56:da:7d', 'vm-uuid': '45dc88d8-d6b0-4a8b-80b8-07b922fcb467'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89 2025-10-09 14:44:31.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:31.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:44:31.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:31.652 2 INFO os_vif [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:da:7d,bridge_name='br-int',has_traffic_filtering=True,id=d290d4af-922f-473e-8928-23571339137a,network=Network(32a6ee04-a550-43a4-bfdf-03b8e6b3dac4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd290d4af-92') 2025-10-09 14:44:31.695 2 DEBUG nova.virt.libvirt.driver [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11653 2025-10-09 14:44:31.696 2 DEBUG nova.virt.libvirt.driver [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] No BDM found with device name vdb, not building metadata. 
_build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11653 2025-10-09 14:44:31.696 2 DEBUG nova.virt.libvirt.driver [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] No VIF found with MAC fa:16:3e:56:da:7d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11629 2025-10-09 14:44:31.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:31.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:31.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:31.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:31.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:31.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:31.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:32.424 2 DEBUG nova.virt.driver [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Emitting event Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1612 2025-10-09 14:44:32.424 2 INFO nova.compute.manager [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] VM Started (Lifecycle Event) 2025-10-09 14:44:32.475 2 DEBUG nova.compute.manager [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1655 2025-10-09 14:44:32.480 2 DEBUG nova.virt.driver [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Emitting event Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1612 2025-10-09 14:44:32.480 2 INFO nova.compute.manager [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] VM Paused (Lifecycle Event) 2025-10-09 14:44:32.529 2 DEBUG nova.compute.manager [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1655 2025-10-09 14:44:32.536 2 DEBUG nova.compute.manager [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1289 2025-10-09 14:44:32.562 2 DEBUG nova.compute.manager [req-99e7d35a-c5ca-4aa2-b344-d41d7f00cae9 e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Received event network-vif-plugged-d290d4af-922f-473e-8928-23571339137a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10683 2025-10-09 14:44:32.563 2 
DEBUG oslo_concurrency.lockutils [req-99e7d35a-c5ca-4aa2-b344-d41d7f00cae9 e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] Lock "45dc88d8-d6b0-4a8b-80b8-07b922fcb467-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:44:32.563 2 DEBUG oslo_concurrency.lockutils [req-99e7d35a-c5ca-4aa2-b344-d41d7f00cae9 e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] Lock "45dc88d8-d6b0-4a8b-80b8-07b922fcb467-events" released by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:44:32.564 2 DEBUG nova.compute.manager [req-99e7d35a-c5ca-4aa2-b344-d41d7f00cae9 e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Processing event network-vif-plugged-d290d4af-922f-473e-8928-23571339137a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10449 2025-10-09 14:44:32.564 2 DEBUG nova.compute.manager [req-99e7d35a-c5ca-4aa2-b344-d41d7f00cae9 e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Received event network-vif-plugged-d290d4af-922f-473e-8928-23571339137a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10683 2025-10-09 14:44:32.565 2 DEBUG oslo_concurrency.lockutils [req-99e7d35a-c5ca-4aa2-b344-d41d7f00cae9 e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] Lock "45dc88d8-d6b0-4a8b-80b8-07b922fcb467-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:44:32.565 2 DEBUG oslo_concurrency.lockutils [req-99e7d35a-c5ca-4aa2-b344-d41d7f00cae9 e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] Lock "45dc88d8-d6b0-4a8b-80b8-07b922fcb467-events" released by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:44:32.565 2 DEBUG nova.compute.manager [req-99e7d35a-c5ca-4aa2-b344-d41d7f00cae9 e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] No waiting events found dispatching network-vif-plugged-d290d4af-922f-473e-8928-23571339137a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:319 2025-10-09 14:44:32.566 2 WARNING nova.compute.manager [req-99e7d35a-c5ca-4aa2-b344-d41d7f00cae9 e519e942d95442278562f8902499cf11 999a51e1ca114e11bcba401d9c4138e2 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Received unexpected event network-vif-plugged-d290d4af-922f-473e-8928-23571339137a for instance with vm_state building and task_state spawning. 
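[editor's note] The ovsdbapp transaction logged above (AddPortCommand on br-int followed by DbSetCommand on the Interface row) is the wiring os-vif performs when it plugs the VIF. The sketch below is only an approximate manual equivalent using the ovs-vsctl CLI, useful for inspecting or reproducing the same OVSDB state outside Nova; it is not the os-vif code path itself, and the bridge, port, MAC, and UUID values are copied from the log entries above.

```python
# Minimal sketch: approximate the ovsdbapp transaction logged above with ovs-vsctl.
# os-vif itself talks OVSDB directly (AddPortCommand / DbSetCommand); this is only an
# illustrative manual equivalent using values taken from the log.
import subprocess

BRIDGE = "br-int"
PORT = "tapd290d4af-92"
IFACE_ID = "d290d4af-922f-473e-8928-23571339137a"
MAC = "fa:16:3e:56:da:7d"
VM_UUID = "45dc88d8-d6b0-4a8b-80b8-07b922fcb467"

def run(*cmd):
    """Echo and run a command, raising on non-zero exit."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Roughly AddPortCommand(bridge=br-int, port=tapd290d4af-92, may_exist=True)
run("ovs-vsctl", "--may-exist", "add-port", BRIDGE, PORT)

# Roughly DbSetCommand(table=Interface, record=tapd290d4af-92, external_ids={...})
run("ovs-vsctl", "set", "Interface", PORT,
    f"external_ids:iface-id={IFACE_ID}",
    "external_ids:iface-status=active",
    f"external_ids:attached-mac={MAC}",
    f"external_ids:vm-uuid={VM_UUID}")

# Read back what was written (read-only verification).
run("ovs-vsctl", "get", "Interface", PORT, "external_ids")
```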
2025-10-09 14:44:32.572 2 DEBUG nova.virt.libvirt.driver [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4290 2025-10-09 14:44:32.576 2 INFO nova.virt.libvirt.driver [-] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Instance spawned successfully. 2025-10-09 14:44:32.577 2 DEBUG nova.virt.libvirt.driver [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:852 2025-10-09 14:44:32.605 2 INFO nova.compute.manager [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] During sync_power_state the instance has a pending task (spawning). Skip. 2025-10-09 14:44:32.605 2 DEBUG nova.virt.driver [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] Emitting event Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1612 2025-10-09 14:44:32.606 2 INFO nova.compute.manager [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] VM Resumed (Lifecycle Event) 2025-10-09 14:44:32.649 2 DEBUG nova.virt.libvirt.driver [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:881 2025-10-09 14:44:32.650 2 DEBUG nova.virt.libvirt.driver [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:881 2025-10-09 14:44:32.650 2 DEBUG nova.virt.libvirt.driver [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:881 2025-10-09 14:44:32.651 2 DEBUG nova.virt.libvirt.driver [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:881 2025-10-09 14:44:32.651 2 DEBUG nova.virt.libvirt.driver [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:881 2025-10-09 14:44:32.652 2 DEBUG nova.virt.libvirt.driver 
[req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:881 2025-10-09 14:44:32.658 2 DEBUG nova.compute.manager [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1655 2025-10-09 14:44:32.663 2 DEBUG nova.compute.manager [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1289 2025-10-09 14:44:32.707 2 INFO nova.compute.manager [req-c7ffa774-aa3c-4a1b-9f61-d165e7d5ae11 - - - - -] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] During sync_power_state the instance has a pending task (spawning). Skip. 2025-10-09 14:44:32.743 2 INFO nova.compute.manager [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Took 2.77 seconds to spawn the instance on the hypervisor. 2025-10-09 14:44:32.743 2 DEBUG nova.compute.manager [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1655 2025-10-09 14:44:32.813 2 INFO nova.compute.manager [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Took 4.59 seconds to build instance. 2025-10-09 14:44:32.837 2 DEBUG oslo_concurrency.lockutils [req-223239d9-151d-4df4-bed2-4d2b9c0dca50 edbfdb44c4234cc2878badaa44d34c1e c66b981ab228418cac8c3cbc73ada639 - default default] Lock "45dc88d8-d6b0-4a8b-80b8-07b922fcb467" released by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 4.682s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:44:36.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:36.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:41.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:41.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:46.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:44:46.722 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-09 14:44:46.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:46.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:48.642 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:44:48.721 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:44:49.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:44:49.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:44:49.723 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:10818 2025-10-09 14:44:50.736 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:44:50.736 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10780 2025-10-09 14:44:50.754 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10789 2025-10-09 14:44:51.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:44:51.722 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-09 14:44:51.722 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518 2025-10-09 14:44:51.817 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Acquired lock "refresh_cache-f9af0e26-e2a2-439e-9de7-367991eb09d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:266 2025-10-09 14:44:51.818 2 DEBUG nova.network.neutron [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Forcefully refreshing network 
info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:1866 2025-10-09 14:44:51.818 2 DEBUG nova.objects.instance [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lazy-loading 'info_cache' on Instance uuid f9af0e26-e2a2-439e-9de7-367991eb09d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1098 2025-10-09 14:44:51.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:51.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:52.376 2 DEBUG nova.network.neutron [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Updating instance_info_cache with network_info: [{"id": "fb9e0330-dc2b-4a7f-ac53-9170e023695a", "address": "fa:16:3e:10:ac:f1", "network": {"id": "32a6ee04-a550-43a4-bfdf-03b8e6b3dac4", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.98", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c66b981ab228418cac8c3cbc73ada639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapfb9e0330-dc", "ovs_interfaceid": "fb9e0330-dc2b-4a7f-ac53-9170e023695a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116 2025-10-09 14:44:52.392 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Releasing lock "refresh_cache-f9af0e26-e2a2-439e-9de7-367991eb09d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:282 2025-10-09 14:44:52.393 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9585 2025-10-09 14:44:52.393 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:44:52.393 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:44:52.393 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:44:52.412 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: 
waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:44:52.413 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:44:52.413 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858 2025-10-09 14:44:52.413 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:44:52.912 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:44:52.967 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:44:52.967 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:44:52.967 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:44:52.971 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:44:52.971 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:44:53.220 2 WARNING nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
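The audit above shells out to `ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf` before reporting the Hypervisor/Node resource view that follows. Below is a minimal sketch, assuming an RBD pool named `vms` and the `max_avail`/`stored` stat fields (both are assumptions; the pool name comes from configuration and the field names vary across Ceph releases), of how that JSON can be reduced to per-pool total/free GiB figures of the kind the resource tracker logs next. This is an illustration, not the driver's own code:

import json
import subprocess

def rbd_pool_capacity(pool="vms", ceph_id="openstack",
                      conf="/etc/ceph/ceph.conf"):
    """Reduce `ceph df --format=json` output to (total_gib, free_gib) for one pool.

    Sketch only: the pool name and the stat field names ('max_avail',
    'stored'/'bytes_used') are assumptions and differ between Ceph releases.
    """
    out = subprocess.check_output(
        ["ceph", "df", "--format=json", "--id", ceph_id, "--conf", conf])
    for entry in json.loads(out).get("pools", []):
        if entry.get("name") == pool:
            stats = entry["stats"]
            free = stats["max_avail"]  # estimated bytes still writable to this pool
            used = stats.get("stored", stats.get("bytes_used", 0))  # bytes already stored
            gib = 1024 ** 3
            return (free + used) / gib, free / gib
    raise LookupError(f"pool {pool!r} not reported by 'ceph df'")

if __name__ == "__main__":
    total_gib, free_gib = rbd_pool_capacity()
    print(f"total={total_gib:.2f}GiB free={free_gib:.2f}GiB")

Running it by hand on the compute host gives a quick cross-check of the disk figures that appear in the resource view records which follow.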
2025-10-09 14:44:53.222 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=3668MB free_disk=6.949649810791016GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033 2025-10-09 14:44:53.222 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:44:53.343 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Instance f9af0e26-e2a2-439e-9de7-367991eb09d8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1548 2025-10-09 14:44:53.343 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Instance 45dc88d8-d6b0-4a8b-80b8-07b922fcb467 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1548 2025-10-09 14:44:53.344 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056 2025-10-09 14:44:53.344 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=1536MB phys_disk=6GB used_disk=3GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065 2025-10-09 14:44:53.393 2 DEBUG nova.scheduler.client.report [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Refreshing inventories for resource provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804 2025-10-09 14:44:53.414 2 DEBUG nova.scheduler.client.report [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Updating ProviderTree inventory for provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768 2025-10-09 14:44:53.415 2 DEBUG nova.compute.provider_tree [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Updating inventory in ProviderTree for provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:175 2025-10-09 14:44:53.427 2 DEBUG nova.scheduler.client.report [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Refreshing aggregate associations for resource provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813 2025-10-09 14:44:53.445 2 DEBUG nova.scheduler.client.report [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Refreshing trait associations for resource provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a, traits: 
COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_BMI2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_SATA,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AESNI,HW_CPU_X86_SSE4A,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_NODE,HW_CPU_X86_BMI,HW_CPU_X86_ABM,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_AMD_SVM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE42,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_F16C _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825 2025-10-09 14:44:53.446 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:44:53.939 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:44:53.946 2 DEBUG nova.compute.provider_tree [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Inventory has not changed in ProviderTree for provider: 019ad6c7-b66d-4a42-8fad-340daa8e4d4a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179 2025-10-09 14:44:53.964 2 DEBUG nova.scheduler.client.report [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Inventory has not changed for provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 2025-10-09 14:44:53.965 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994 2025-10-09 14:44:53.965 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.743s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:44:53.966 2 DEBUG oslo_service.periodic_task 
[req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:44:54.306 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:44:54.306 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:44:56.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:44:56.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:45:01.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:45:01.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:45:06.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:45:06.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:45:06.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:45:06.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:45:06.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:45:06.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:45:11.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:45:11.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:45:11.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:45:11.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:45:11.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:45:11.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:45:16.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:45:16.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:45:16.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:45:16.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:45:17.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:45:17.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:45:22.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:45:22.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:45:22.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:45:22.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:45:22.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:45:22.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:45:24.240 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:45:24.262 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Triggering sync for uuid f9af0e26-e2a2-439e-9de7-367991eb09d8 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:9924 2025-10-09 14:45:24.262 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Triggering sync for uuid 45dc88d8-d6b0-4a8b-80b8-07b922fcb467 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:9924 2025-10-09 14:45:24.263 2 DEBUG oslo_concurrency.lockutils [-] Lock "f9af0e26-e2a2-439e-9de7-367991eb09d8" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:45:24.263 2 DEBUG oslo_concurrency.lockutils [-] Lock "45dc88d8-d6b0-4a8b-80b8-07b922fcb467" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:45:24.325 2 DEBUG oslo_concurrency.lockutils [-] Lock "45dc88d8-d6b0-4a8b-80b8-07b922fcb467" released by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.062s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:45:24.327 2 DEBUG oslo_concurrency.lockutils [-] Lock "f9af0e26-e2a2-439e-9de7-367991eb09d8" released by
"nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.065s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:45:27.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:45:27.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:45:27.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:45:27.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:45:27.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:45:27.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:45:32.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:45:32.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:45:32.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:45:32.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:45:32.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:45:32.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:45:37.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:45:37.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:45:37.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5033 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:45:37.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:45:37.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:45:37.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:45:42.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:45:42.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:45:42.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:45:42.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:45:42.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:45:42.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:45:47.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:45:47.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:45:47.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:45:47.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:45:47.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:45:47.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:45:47.721 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:45:47.722 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-09 14:45:50.723 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:45:51.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:45:51.723 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-09 14:45:51.861 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Acquired lock "refresh_cache-45dc88d8-d6b0-4a8b-80b8-07b922fcb467" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:266 2025-10-09 14:45:51.862 2 DEBUG nova.network.neutron [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:1866 2025-10-09 14:45:52.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:45:52.417 2 DEBUG nova.network.neutron [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Updating instance_info_cache with network_info: [{"id": "d290d4af-922f-473e-8928-23571339137a", "address": "fa:16:3e:56:da:7d", "network": {"id": "32a6ee04-a550-43a4-bfdf-03b8e6b3dac4", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.158", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c66b981ab228418cac8c3cbc73ada639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapd290d4af-92", "ovs_interfaceid": "d290d4af-922f-473e-8928-23571339137a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116 2025-10-09 14:45:52.433 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Releasing lock "refresh_cache-45dc88d8-d6b0-4a8b-80b8-07b922fcb467" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:282 2025-10-09 14:45:52.434 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9585 2025-10-09 14:45:52.434 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:45:52.435 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:45:52.435 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:45:52.450 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:45:52.451 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:45:52.451 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858 2025-10-09 14:45:52.452 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:45:52.879 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:45:52.937 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:45:52.937 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:45:52.937 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:45:52.940 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:45:52.940 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:45:53.096 2 WARNING nova.virt.libvirt.driver 
[req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 2025-10-09 14:45:53.097 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=3760MB free_disk=6.949642181396484GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033 2025-10-09 14:45:53.097 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:45:53.142 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Instance f9af0e26-e2a2-439e-9de7-367991eb09d8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1548 2025-10-09 14:45:53.143 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Instance 45dc88d8-d6b0-4a8b-80b8-07b922fcb467 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1548 2025-10-09 14:45:53.143 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056 2025-10-09 14:45:53.143 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=1536MB phys_disk=6GB used_disk=3GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065 2025-10-09 14:45:53.144 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:45:53.599 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:45:53.609 2 DEBUG nova.compute.provider_tree [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Inventory has not changed in ProviderTree for provider: 019ad6c7-b66d-4a42-8fad-340daa8e4d4a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179 2025-10-09 14:45:53.621 2 DEBUG nova.scheduler.client.report [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Inventory has not changed for provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 2025-10-09 14:45:53.622 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994 2025-10-09 14:45:53.622 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.525s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:45:53.910 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:45:53.910 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:45:54.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._instance_usage_audit 
run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:45:57.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:46:02.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:46:07.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:46:12.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:46:17.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:46:22.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:46:27.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:46:32.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:46:37.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:46:42.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:46:42.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:46:42.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:46:42.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:46:42.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:46:42.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:46:47.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:46:48.657 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:46:48.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:46:48.722 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-09 14:46:50.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:46:51.721 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:46:51.722 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-09 14:46:51.722 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518 2025-10-09 14:46:51.809 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Acquired lock "refresh_cache-f9af0e26-e2a2-439e-9de7-367991eb09d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:266 2025-10-09 14:46:51.809 2 DEBUG nova.network.neutron [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:1866 2025-10-09 14:46:51.810 2 DEBUG nova.objects.instance [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lazy-loading 'info_cache' on Instance uuid f9af0e26-e2a2-439e-9de7-367991eb09d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1098 2025-10-09 14:46:52.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:46:52.378 2 DEBUG nova.network.neutron [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Updating instance_info_cache with network_info: [{"id": "fb9e0330-dc2b-4a7f-ac53-9170e023695a", "address": "fa:16:3e:10:ac:f1", "network": {"id": "32a6ee04-a550-43a4-bfdf-03b8e6b3dac4", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.98", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c66b981ab228418cac8c3cbc73ada639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapfb9e0330-dc", "ovs_interfaceid": "fb9e0330-dc2b-4a7f-ac53-9170e023695a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116 2025-10-09 14:46:52.395 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Releasing lock "refresh_cache-f9af0e26-e2a2-439e-9de7-367991eb09d8" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:282 2025-10-09 14:46:52.395 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9585 2025-10-09 14:46:52.395 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:46:52.414 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:46:52.415 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:46:52.415 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858 2025-10-09 14:46:52.415 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:46:52.876 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:46:52.933 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:46:52.934 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:46:52.934 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:46:52.938 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:46:52.938 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:46:53.178 2 WARNING 
nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 2025-10-09 14:46:53.179 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=3757MB free_disk=6.949642181396484GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033 2025-10-09 14:46:53.180 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:46:53.247 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Instance f9af0e26-e2a2-439e-9de7-367991eb09d8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1548 2025-10-09 14:46:53.248 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Instance 45dc88d8-d6b0-4a8b-80b8-07b922fcb467 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1548 2025-10-09 14:46:53.248 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056 2025-10-09 14:46:53.248 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=1536MB phys_disk=6GB used_disk=3GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065 2025-10-09 14:46:53.252 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:46:53.677 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:46:53.682 2 DEBUG nova.compute.provider_tree [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Inventory has not changed in ProviderTree for provider: 019ad6c7-b66d-4a42-8fad-340daa8e4d4a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179 2025-10-09 14:46:53.696 2 DEBUG nova.scheduler.client.report [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Inventory has not changed for provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 2025-10-09 14:46:53.696 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994 2025-10-09 14:46:53.697 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.517s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:46:54.023 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:46:54.024 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:46:54.025 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes 
run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:46:54.025 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:46:56.723 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:46:57.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:47:02.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:47:07.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:47:07.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:47:12.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:47:12.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:47:17.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:47:17.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:47:22.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:47:22.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:47:27.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:47:27.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:47:27.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:47:27.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:47:27.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:47:27.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:47:32.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:47:37.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:47:42.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:47:47.283 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:47:50.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:47:50.723 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:47:50.724 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-09 14:47:51.723 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:47:51.724 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-09 14:47:51.881 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Acquired lock "refresh_cache-45dc88d8-d6b0-4a8b-80b8-07b922fcb467" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:266 2025-10-09 14:47:51.881 2 DEBUG nova.network.neutron [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:1866 2025-10-09 14:47:52.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:47:52.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:47:52.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:47:52.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:47:52.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:47:52.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:47:52.377 2 DEBUG nova.network.neutron [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Updating instance_info_cache with network_info: [{"id": "d290d4af-922f-473e-8928-23571339137a", "address": "fa:16:3e:56:da:7d", "network": {"id": "32a6ee04-a550-43a4-bfdf-03b8e6b3dac4", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.158", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": 
"192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c66b981ab228418cac8c3cbc73ada639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapd290d4af-92", "ovs_interfaceid": "d290d4af-922f-473e-8928-23571339137a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116 2025-10-09 14:47:52.392 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Releasing lock "refresh_cache-45dc88d8-d6b0-4a8b-80b8-07b922fcb467" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:282 2025-10-09 14:47:52.393 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9585 2025-10-09 14:47:52.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:47:52.743 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:47:52.744 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:47:52.744 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858 2025-10-09 14:47:52.745 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:47:53.212 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:47:53.284 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:47:53.284 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:47:53.285 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - 
- - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:47:53.289 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:47:53.289 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:47:53.559 2 WARNING nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 2025-10-09 14:47:53.561 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=3862MB free_disk=6.949642181396484GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033 2025-10-09 14:47:53.562 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:47:53.630 2 DEBUG 
nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Instance f9af0e26-e2a2-439e-9de7-367991eb09d8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1548 2025-10-09 14:47:53.631 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Instance 45dc88d8-d6b0-4a8b-80b8-07b922fcb467 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1548 2025-10-09 14:47:53.631 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056 2025-10-09 14:47:53.631 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=1536MB phys_disk=6GB used_disk=3GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065 2025-10-09 14:47:53.634 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:47:54.051 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:47:54.058 2 DEBUG nova.compute.provider_tree [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Inventory has not changed in ProviderTree for provider: 019ad6c7-b66d-4a42-8fad-340daa8e4d4a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179 2025-10-09 14:47:54.077 2 DEBUG nova.scheduler.client.report [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Inventory has not changed for provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 2025-10-09 14:47:54.078 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994 2025-10-09 14:47:54.078 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.517s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:47:54.998 2 DEBUG 
oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:47:54.999 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:47:55.000 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:47:55.000 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:47:57.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:47:57.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:48:02.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:48:07.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:48:07.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:48:12.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:48:12.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:48:17.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:48:22.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:48:27.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:48:27.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:48:27.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:48:27.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:48:27.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:48:27.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:48:32.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:48:32.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:48:37.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:48:42.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:48:47.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:48:47.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:48:49.657 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:48:50.721 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:48:50.722 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-09 14:48:51.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:48:52.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:48:52.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:48:52.723 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-09 14:48:52.723 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518 2025-10-09 14:48:52.927 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Acquired lock "refresh_cache-f9af0e26-e2a2-439e-9de7-367991eb09d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:266 2025-10-09 14:48:52.928 2 DEBUG nova.network.neutron [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:1866 2025-10-09 14:48:52.928 2 DEBUG nova.objects.instance [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lazy-loading 'info_cache' on Instance uuid f9af0e26-e2a2-439e-9de7-367991eb09d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1098 2025-10-09 14:48:53.399 2 DEBUG nova.network.neutron 
[req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Updating instance_info_cache with network_info: [{"id": "fb9e0330-dc2b-4a7f-ac53-9170e023695a", "address": "fa:16:3e:10:ac:f1", "network": {"id": "32a6ee04-a550-43a4-bfdf-03b8e6b3dac4", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.98", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c66b981ab228418cac8c3cbc73ada639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapfb9e0330-dc", "ovs_interfaceid": "fb9e0330-dc2b-4a7f-ac53-9170e023695a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116 2025-10-09 14:48:53.414 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Releasing lock "refresh_cache-f9af0e26-e2a2-439e-9de7-367991eb09d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:282 2025-10-09 14:48:53.414 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9585 2025-10-09 14:48:53.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:48:53.723 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:48:54.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:48:54.745 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:48:54.745 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:48:54.746 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858 2025-10-09 
14:48:54.746 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:48:55.201 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:48:55.291 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:48:55.292 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:48:55.292 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:48:55.297 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:48:55.298 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:48:55.540 2 WARNING nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
2025-10-09 14:48:55.541 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=5187MB free_disk=6.949642181396484GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033 2025-10-09 14:48:55.541 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:48:55.616 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Instance f9af0e26-e2a2-439e-9de7-367991eb09d8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1548 2025-10-09 14:48:55.617 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Instance 45dc88d8-d6b0-4a8b-80b8-07b922fcb467 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1548 2025-10-09 14:48:55.617 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056 2025-10-09 14:48:55.617 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=1536MB phys_disk=6GB used_disk=3GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065 2025-10-09 14:48:55.619 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:48:56.105 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:48:56.110 2 DEBUG nova.compute.provider_tree [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Inventory has not changed in ProviderTree for provider: 019ad6c7-b66d-4a42-8fad-340daa8e4d4a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179 2025-10-09 14:48:56.123 2 DEBUG nova.scheduler.client.report [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Inventory has not changed for provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 2025-10-09 14:48:56.123 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994 2025-10-09 14:48:56.124 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.582s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:48:57.123 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:48:57.125 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:48:57.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:48:59.722 2 DEBUG 
oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:49:02.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:49:07.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:49:07.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:49:12.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:49:17.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:49:17.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:49:17.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:49:17.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:49:17.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:49:17.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:49:22.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:49:27.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:49:32.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:49:32.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:49:32.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:49:32.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:49:32.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:49:32.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:49:37.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:49:42.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:49:42.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:49:47.359 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:49:51.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:49:51.723 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:49:51.724 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-09 14:49:52.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:49:52.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:49:52.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:49:52.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:49:52.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:49:52.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:49:52.723 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:49:52.723 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-09 14:49:52.868 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Acquired lock "refresh_cache-45dc88d8-d6b0-4a8b-80b8-07b922fcb467" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:266 2025-10-09 14:49:52.868 2 DEBUG nova.network.neutron [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:1866 2025-10-09 14:49:53.404 2 DEBUG nova.network.neutron [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Updating instance_info_cache with network_info: [{"id": "d290d4af-922f-473e-8928-23571339137a", "address": "fa:16:3e:56:da:7d", "network": {"id": "32a6ee04-a550-43a4-bfdf-03b8e6b3dac4", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.158", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": 
"192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c66b981ab228418cac8c3cbc73ada639", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapd290d4af-92", "ovs_interfaceid": "d290d4af-922f-473e-8928-23571339137a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116 2025-10-09 14:49:53.420 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Releasing lock "refresh_cache-45dc88d8-d6b0-4a8b-80b8-07b922fcb467" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:282 2025-10-09 14:49:53.420 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] [instance: 45dc88d8-d6b0-4a8b-80b8-07b922fcb467] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9585 2025-10-09 14:49:53.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:49:53.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:49:53.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:49:54.730 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:49:54.749 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:49:54.750 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:49:54.750 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858 2025-10-09 14:49:54.750 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:49:55.228 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id 
openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:49:55.305 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:49:55.306 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:49:55.306 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:49:55.311 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:49:55.311 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:49:55.591 2 WARNING nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 2025-10-09 14:49:55.593 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=6952MB free_disk=6.949642181396484GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, 
{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033 2025-10-09 14:49:55.593 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:49:55.680 2 ERROR nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Skipping removal of allocations for deleted instances: Failed to retrieve allocations for resource provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a: {"message": "The server is currently unavailable. Please try again at a later time.
\nThe Keystone service is temporarily unavailable.\n\n", "code": "503 Service Unavailable", "title": "Service Unavailable"}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a: {"message": "The server is currently unavailable. Please try again at a later time.
\nThe Keystone service is temporarily unavailable.\n\n", "code": "503 Service Unavailable", "title": "Service Unavailable"} 2025-10-09 14:49:55.681 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056 2025-10-09 14:49:55.682 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=1536MB phys_disk=6GB used_disk=3GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065 2025-10-09 14:49:55.697 2 ERROR nova.scheduler.client.report [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] [req-ea2d0662-3510-49c6-b20b-3c8fc7dcfa36] Failed to retrieve resource provider tree from placement API for UUID 019ad6c7-b66d-4a42-8fad-340daa8e4d4a. Got 503: {"message": "The server is currently unavailable. Please try again at a later time.
\nThe Keystone service is temporarily unavailable.\n\n", "code": "503 Service Unavailable", "title": "Service Unavailable"}. 2025-10-09 14:49:55.697 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Error updating resources for node standalone.localdomain.: nova.exception.ResourceProviderRetrievalFailed: Failed to get resource provider with UUID 019ad6c7-b66d-4a42-8fad-340daa8e4d4a 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager Traceback (most recent call last): 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10167, in _update_available_resource_for_node 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager self.rt.update_available_resource(context, nodename, 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 886, in update_available_resource 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager self._update_available_resource(context, resources, startup=startup) 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 360, in inner 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager return f(*args, **kwargs) 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 993, in _update_available_resource 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager self._update(context, cn, startup=startup) 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 1244, in _update 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager self._update_to_placement(context, compute_node, startup) 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/retrying.py", line 49, in wrapped_f 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager return Retrying(*dargs, **dkw).call(f, *args, **kw) 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/retrying.py", line 206, in call 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager return attempt.get(self._wrap_exception) 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/retrying.py", line 247, in get 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager six.reraise(self.value[0], self.value[1], self.value[2]) 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/six.py", line 709, in reraise 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager raise value 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/retrying.py", line 200, in call 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager attempt = Attempt(fn(*args, **kwargs), attempt_number, False) 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 1174, in _update_to_placement 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager prov_tree = self.reportclient.get_provider_tree_and_ensure_root( 2025-10-09 14:49:55.697 2 ERROR 
nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/scheduler/client/report.py", line 899, in get_provider_tree_and_ensure_root 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager self._ensure_resource_provider( 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/scheduler/client/report.py", line 688, in _ensure_resource_provider 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager rps_to_refresh = self.get_providers_in_tree(context, uuid) 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/scheduler/client/report.py", line 551, in get_providers_in_tree 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager raise exception.ResourceProviderRetrievalFailed(uuid=uuid) 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager nova.exception.ResourceProviderRetrievalFailed: Failed to get resource provider with UUID 019ad6c7-b66d-4a42-8fad-340daa8e4d4a 2025-10-09 14:49:55.697 2 ERROR nova.compute.manager 2025-10-09 14:49:55.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:49:55.723 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:49:55.723 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:10818 2025-10-09 14:49:57.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:49:57.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:49:57.733 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:50:01.721 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:50:02.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:50:02.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:50:02.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:50:02.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:50:02.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:50:02.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
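The ResourceProviderRetrievalFailed traceback above is nova-compute's placement report client receiving a 503 whose body says Keystone is temporarily unavailable, so the resource provider tree for UUID 019ad6c7-b66d-4a42-8fad-340daa8e4d4a cannot be refreshed. A minimal sketch of probing the identity and placement endpoints directly to confirm the outage; the URLs below are assumptions for a standalone deployment, not values taken from this log:

# Hypothetical endpoint probe; adjust the URLs to those shown by `openstack endpoint list`.
import requests

ENDPOINTS = {
    "keystone": "http://127.0.0.1:5000/v3",   # assumed identity endpoint
    "placement": "http://127.0.0.1:8778",     # assumed placement endpoint
}

for name, url in ENDPOINTS.items():
    try:
        resp = requests.get(url, timeout=5)
        # A 503 here matches the "Keystone service is temporarily unavailable"
        # body quoted in the log; a healthy service returns 2xx/3xx version data.
        print(f"{name}: HTTP {resp.status_code}")
    except requests.RequestException as exc:
        print(f"{name}: unreachable ({exc})")

Once Keystone answers normally again, the periodic update_available_resource task should repopulate placement on its next run without manual intervention.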
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:50:02.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:50:02.722 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10780 2025-10-09 14:50:02.735 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10789 2025-10-09 14:50:07.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:50:07.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:50:07.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:50:07.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:50:07.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:50:07.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:50:12.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:50:17.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:50:17.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:50:17.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:50:17.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:50:17.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:50:17.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:50:22.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:50:22.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:50:22.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:50:22.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:50:22.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] 
on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:50:22.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:50:27.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:50:27.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:50:27.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5009 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:50:27.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:50:27.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:50:27.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:50:32.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:50:32.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:50:32.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:50:32.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:50:32.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:50:32.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:50:37.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:50:37.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:50:37.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:50:37.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:50:37.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:50:37.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:50:42.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:50:42.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:50:42.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity 
probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:50:42.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:50:42.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:50:42.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:50:47.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:50:47.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:50:47.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:50:47.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:50:47.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:50:47.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:50:52.736 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:50:52.736 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:50:52.737 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-09 14:50:52.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:50:52.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:50:52.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5026 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:50:52.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:50:52.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:50:52.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:50:53.643 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:50:53.721 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:50:53.721 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-09 14:50:53.722 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518 2025-10-09 14:50:57.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:50:59.151 2 DEBUG neutronclient.v2_0.client [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Error message:
502 Bad Gateway
The server returned an invalid or incomplete response. _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:258 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] An error occurred while refreshing the network cache.: neutronclient.common.exceptions.NeutronClientException:
502 Bad Gateway
The server returned an invalid or incomplete response. 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] Traceback (most recent call last): 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9577, in _heal_instance_info_cache 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] self._require_nw_info_update(context, instance): 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9484, in _require_nw_info_update 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] ports = self.network_api.list_ports(context, **search_opts) 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 1806, in list_ports 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] return get_client(context).list_ports(**search_opts) 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] ret = obj(*args, **kwargs) 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 799, in list_ports 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] return self.list('ports', self.ports_path, retrieve_all, 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] ret = obj(*args, **kwargs) 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 368, in list 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] for r in self._pagination(collection, path, **params): 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 383, in _pagination 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] res = self.get(path, params=params) 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] ret = obj(*args, **kwargs) 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 352, in get 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] return 
self.retry_request("GET", action, body=body, 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] ret = obj(*args, **kwargs) 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 329, in retry_request 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] return self.do_request(method, action, body=body, 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] ret = obj(*args, **kwargs) 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 293, in do_request 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] self._handle_fault_response(status_code, replybody, resp) 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] File "/usr/lib/python3.9/site-packages/nova/network/neutron.py", line 182, in wrapper 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] ret = obj(*args, **kwargs) 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 268, in _handle_fault_response 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] exception_handler_v20(status_code, error_body) 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] File "/usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] raise client_exc(message=error_message, 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] neutronclient.common.exceptions.NeutronClientException:
502 Bad Gateway
2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] The server returned an invalid or incomplete response. 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] 2025-10-09 14:50:59.152 2 ERROR nova.compute.manager [instance: f9af0e26-e2a2-439e-9de7-367991eb09d8] 2025-10-09 14:50:59.160 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:50:59.160 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:50:59.160 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:50:59.161 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:50:59.181 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:50:59.181 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:50:59.182 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858 2025-10-09 14:50:59.182 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-09 14:50:59.650 2 DEBUG oslo_concurrency.processutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-09 14:50:59.738 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:50:59.739 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 
2025-10-09 14:50:59.739 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:50:59.742 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:50:59.742 2 DEBUG nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-09 14:51:00.001 2 WARNING nova.virt.libvirt.driver [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 2025-10-09 14:51:00.002 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=8906MB free_disk=6.949642181396484GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033 2025-10-09 14:51:00.002 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-09 14:51:00.068 2 ERROR nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Skipping removal of allocations for deleted instances: Failed to retrieve allocations for resource provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a: {"message": "The server is currently unavailable. Please try again at a later time.
\nThe Keystone service is temporarily unavailable.\n\n", "code": "503 Service Unavailable", "title": "Service Unavailable"}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 019ad6c7-b66d-4a42-8fad-340daa8e4d4a: {"message": "The server is currently unavailable. Please try again at a later time.
\nThe Keystone service is temporarily unavailable.\n\n", "code": "503 Service Unavailable", "title": "Service Unavailable"} 2025-10-09 14:51:00.069 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056 2025-10-09 14:51:00.069 2 DEBUG nova.compute.resource_tracker [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=1536MB phys_disk=6GB used_disk=3GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065 2025-10-09 14:51:00.081 2 ERROR nova.scheduler.client.report [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] [req-985f0c7c-560d-4232-a158-7f90a80d0d48] Failed to retrieve resource provider tree from placement API for UUID 019ad6c7-b66d-4a42-8fad-340daa8e4d4a. Got 503: {"message": "The server is currently unavailable. Please try again at a later time.
\nThe Keystone service is temporarily unavailable.\n\n", "code": "503 Service Unavailable", "title": "Service Unavailable"}. 2025-10-09 14:51:00.081 2 DEBUG oslo_concurrency.lockutils [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Error updating resources for node standalone.localdomain.: nova.exception.ResourceProviderRetrievalFailed: Failed to get resource provider with UUID 019ad6c7-b66d-4a42-8fad-340daa8e4d4a 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager Traceback (most recent call last): 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10167, in _update_available_resource_for_node 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager self.rt.update_available_resource(context, nodename, 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 886, in update_available_resource 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager self._update_available_resource(context, resources, startup=startup) 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 360, in inner 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager return f(*args, **kwargs) 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 993, in _update_available_resource 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager self._update(context, cn, startup=startup) 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 1244, in _update 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager self._update_to_placement(context, compute_node, startup) 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/retrying.py", line 49, in wrapped_f 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager return Retrying(*dargs, **dkw).call(f, *args, **kw) 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/retrying.py", line 206, in call 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager return attempt.get(self._wrap_exception) 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/retrying.py", line 247, in get 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager six.reraise(self.value[0], self.value[1], self.value[2]) 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/six.py", line 709, in reraise 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager raise value 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/retrying.py", line 200, in call 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager attempt = Attempt(fn(*args, **kwargs), attempt_number, False) 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 1174, in _update_to_placement 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager prov_tree = self.reportclient.get_provider_tree_and_ensure_root( 2025-10-09 14:51:00.082 2 ERROR 
nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/scheduler/client/report.py", line 899, in get_provider_tree_and_ensure_root 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager self._ensure_resource_provider( 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/scheduler/client/report.py", line 688, in _ensure_resource_provider 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager rps_to_refresh = self.get_providers_in_tree(context, uuid) 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/scheduler/client/report.py", line 551, in get_providers_in_tree 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager raise exception.ResourceProviderRetrievalFailed(uuid=uuid) 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager nova.exception.ResourceProviderRetrievalFailed: Failed to get resource provider with UUID 019ad6c7-b66d-4a42-8fad-340daa8e4d4a 2025-10-09 14:51:00.082 2 ERROR nova.compute.manager 2025-10-09 14:51:01.003 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:51:01.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:51:02.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:51:02.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:51:02.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:51:02.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:51:02.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:51:02.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:51:07.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:51:07.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:51:07.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:51:07.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:51:07.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:51:07.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:51:12.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:51:17.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:51:17.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:51:17.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:51:17.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:51:17.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:51:17.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:51:22.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:51:22.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:51:22.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5030 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:51:22.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:51:22.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:51:22.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:51:27.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:51:27.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:51:27.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:51:27.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:51:28.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:51:28.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:51:33.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:51:33.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:51:33.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:51:33.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE 
_transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:51:33.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:51:33.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:51:38.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:51:43.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:51:43.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:51:43.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:51:43.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:51:43.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:51:43.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:51:48.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:51:48.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:51:48.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:51:48.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:51:48.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:51:48.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:51:52.737 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:51:53.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:51:53.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:51:53.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5040 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:51:53.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:51:53.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:51:53.151 2 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:51:53.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:51:53.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:51:53.723 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-09 14:51:54.644 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:51:54.722 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:51:58.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:52:03.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:52:03.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:52:03.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:52:03.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:52:03.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:52:03.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:52:08.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:52:08.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:52:08.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:52:08.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:52:08.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:52:08.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:52:13.231 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:52:13.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:52:13.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:52:13.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:52:13.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:52:13.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:52:18.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:52:20.362 2 WARNING nova.servicegroup.drivers.db [-] Lost connection to nova-conductor for reporting service status.: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 41f33af3589f4714a7d5dd811e87ec3f 2025-10-09 14:52:20.367 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec 2025-10-09 14:52:23.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:52:23.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:52:23.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:52:23.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:52:23.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:52:23.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:52:28.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:52:28.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:52:28.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:52:28.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:52:28.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:52:28.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:52:33.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:52:33.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:52:33.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:52:33.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:52:33.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:52:33.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:52:38.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:52:38.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:52:38.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5050 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:52:38.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:52:38.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:52:38.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:52:43.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:52:43.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:52:43.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:52:43.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:52:43.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:52:43.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:52:48.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:52:48.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:52:48.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5038 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:52:48.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:52:48.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE 
_transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:52:48.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:52:53.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:52:53.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Error during ComputeManager.update_available_resource: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 0ae80009ed4b457d9f0bee7aec7bf9e7 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task return self.greenlet.switch() 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task _queue.Empty 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred: 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task task(self, context) 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10225, in update_available_resource 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task compute_nodes_in_db = self._get_compute_nodes_in_db(context, 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10272, in _get_compute_nodes_in_db 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task return objects.ComputeNodeList.get_all_by_host(context, self.host, 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions( 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions 2025-10-09 14:52:54.726 2 ERROR 
oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions', 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg, 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message, 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout, 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout, 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout) 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout( 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 0ae80009ed4b457d9f0bee7aec7bf9e7 2025-10-09 14:52:54.726 2 ERROR oslo_service.periodic_task 2025-10-09 14:52:55.733 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:52:58.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:52:58.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:52:58.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:52:58.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:52:58.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:52:58.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:53:03.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:53:08.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:53:13.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:53:13.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:53:18.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:53:18.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:53:20.371 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec 2025-10-09 14:53:23.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:53:23.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:53:28.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:53:28.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:53:33.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:53:33.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:53:38.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:53:43.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:53:43.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:53:43.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5009 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:53:43.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:53:43.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:53:43.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:53:48.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:53:48.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:53:53.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Error during ComputeManager._sync_scheduler_instance_info: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 54ead61b3a0148529aee48ee8c6e8c8b 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 14:53:55.738 2 ERROR 
oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task return self.greenlet.switch() 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task _queue.Empty 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred: 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task task(self, context) 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2137, in _sync_scheduler_instance_info 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task instances = objects.InstanceList.get_by_host(context, self.host, 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions( 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions', 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg, 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message, 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout, 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout, 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout) 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout( 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 54ead61b3a0148529aee48ee8c6e8c8b 2025-10-09 14:53:55.738 2 ERROR oslo_service.periodic_task 2025-10-09 14:53:55.740 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:53:55.740 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-09 14:53:58.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:54:03.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:54:08.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:54:13.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:54:18.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:54:18.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:54:20.375 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec 2025-10-09 14:54:23.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:54:23.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:54:23.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5015 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:54:23.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:54:23.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:54:23.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:54:28.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:54:28.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:54:28.651 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5034 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:54:28.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:54:28.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:54:28.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:54:33.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:54:38.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:54:38.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:54:38.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5008 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:54:38.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:54:38.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:54:38.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:54:43.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:54:43.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:54:43.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:54:43.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:54:43.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:54:43.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:54:48.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:54:48.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:54:48.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:54:48.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:54:48.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 
14:54:48.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:54:53.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:54:53.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:54:53.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5027 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:54:53.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:54:53.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:54:53.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Error during ComputeManager._heal_instance_info_cache: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID b1949a57fb7645c6b967ee33b744e47c 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task return self.greenlet.switch() 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task _queue.Empty 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred: 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task task(self, context) 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9547, in _heal_instance_info_cache 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task inst = objects.Instance.get_by_uuid( 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task 
result = cls.indirection_api.object_class_action_versions( 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions', 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg, 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message, 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout, 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout, 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout) 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout( 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID b1949a57fb7645c6b967ee33b744e47c 2025-10-09 14:54:55.745 2 ERROR oslo_service.periodic_task 2025-10-09 14:54:55.747 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:54:55.747 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:54:55.748 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:54:55.748 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:54:58.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:54:58.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:54:58.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:54:58.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:54:58.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:54:58.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:55:03.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:55:08.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:55:08.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:55:08.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5018 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:55:08.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:55:08.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:55:08.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:55:13.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:55:13.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:55:13.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:55:13.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:55:13.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:55:13.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:55:18.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:55:18.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:55:20.379 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec 2025-10-09 14:55:23.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:55:23.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:55:28.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:55:28.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:55:33.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:55:33.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:55:38.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:55:38.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:55:43.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:55:43.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:55:43.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:55:43.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:55:43.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:55:43.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:55:48.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:55:53.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:55:53.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:55:53.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:55:53.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:55:53.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:55:53.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Error during ComputeManager._instance_usage_audit: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID a0f3983a977b4644b5b916ed9e8d7bae 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-09 
14:55:55.751 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task return self.greenlet.switch() 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task _queue.Empty 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred: 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task task(self, context) 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9768, in _instance_usage_audit 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task if objects.TaskLog.get(context, 'instance_usage_audit', begin, end, 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/objects/base.py", line 355, in wrapper 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task return fn.__get__(None, obj)(*args, **kwargs) 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions( 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions', 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg, 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message, 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout, 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout, 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout) 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout( 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID a0f3983a977b4644b5b916ed9e8d7bae 2025-10-09 14:55:55.751 2 ERROR oslo_service.periodic_task 2025-10-09 14:55:55.754 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:55:55.755 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:55:58.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:55:58.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:55:58.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:55:58.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:55:58.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:55:58.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:56:03.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:56:08.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:56:08.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:56:08.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5007 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:56:08.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:56:08.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:56:08.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:56:13.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:56:13.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:56:13.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:56:13.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:56:13.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:56:13.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:56:18.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:56:20.384 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec 2025-10-09 14:56:23.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:56:24.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:56:24.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5043 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:56:24.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:56:24.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:56:24.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:56:28.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:56:29.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:56:34.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:56:39.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:56:44.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:56:44.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:56:44.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:56:44.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:56:44.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:56:44.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:56:49.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:56:54.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:56:54.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Error during ComputeManager._sync_power_states: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID c4c7e31041dd400784676a96fbb896d5 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task return self.greenlet.switch() 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task _queue.Empty 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred: 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task task(self, context) 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9878, in _sync_power_states 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task db_instances = objects.InstanceList.get_by_host(context, self.host, 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions( 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions', 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task self.transport._send(self.target, 
msg_ctxt, msg, 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message, 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout, 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout, 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout) 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout( 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID c4c7e31041dd400784676a96fbb896d5 2025-10-09 14:56:55.759 2 ERROR oslo_service.periodic_task 2025-10-09 14:56:55.761 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:56:55.761 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-09 14:56:55.762 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:56:59.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:56:59.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:57:04.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:57:04.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:57:09.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:57:09.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:57:09.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5032 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:57:09.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:57:09.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:57:09.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:57:14.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:57:14.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:57:14.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:57:14.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:57:14.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:57:14.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:57:19.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:57:20.388 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec 2025-10-09 14:57:24.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:57:24.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:57:24.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: idle 5007 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:57:24.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:57:24.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:57:24.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:57:29.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:57:29.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:57:29.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:57:29.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:57:29.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:57:29.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:57:34.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:57:39.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:57:39.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:57:39.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5007 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:57:39.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:57:39.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:57:39.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:57:44.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:57:44.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:57:44.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:57:44.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:57:44.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:57:44.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:57:49.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:57:49.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:57:49.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:57:49.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:57:49.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:57:49.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:57:54.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:57:54.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:57:54.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:57:54.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:57:54.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:57:54.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Error during ComputeManager.update_available_resource: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID b85c147e0a6e408abf4beffb6d707307 2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch 2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task return self.greenlet.switch() 2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task _queue.Empty 2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task 2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task 
During handling of the above exception, another exception occurred:
2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task
2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10225, in update_available_resource
2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task compute_nodes_in_db = self._get_compute_nodes_in_db(context,
2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10272, in _get_compute_nodes_in_db
2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task return objects.ComputeNodeList.get_all_by_host(context, self.host,
2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID b85c147e0a6e408abf4beffb6d707307
2025-10-09 14:57:55.768 2 ERROR oslo_service.periodic_task
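
The traceback above is the failure pattern that repeats for the rest of this capture, only with a different periodic task and message ID each time: the task asks nova-conductor for data through the versioned-object indirection API (objects.ComputeNodeList.get_all_by_host goes out as a conductor object_class_action_versions call), oslo.messaging publishes the call on the message bus and waits for a reply on its reply queue, and the reply never arrives, so the wait ends in MessagingTimeout. Each failing task errors out almost exactly 60 seconds after its "Running periodic task ..." line, which matches oslo.messaging's default rpc_response_timeout of 60 seconds. The sketch below reproduces the same call pattern in isolation; the transport URL, topic, context and timeout are illustrative placeholders, not values taken from this deployment's nova.conf.

    # Minimal sketch of the RPC round-trip that is timing out above.
    # All values here are illustrative placeholders, not read from nova.conf.
    import oslo_messaging
    from oslo_config import cfg

    conf = cfg.CONF
    transport = oslo_messaging.get_rpc_transport(
        conf, url='rabbit://guest:guest@127.0.0.1:5672/')  # placeholder URL
    target = oslo_messaging.Target(topic='conductor')      # placeholder topic
    client = oslo_messaging.RPCClient(transport, target)

    try:
        # prepare(timeout=...) mirrors rpc_response_timeout; call() blocks
        # until a reply is consumed from the reply queue or the timeout fires.
        cctxt = client.prepare(timeout=60)
        cctxt.call({}, 'object_class_action_versions')  # real args omitted
    except oslo_messaging.MessagingTimeout:
        # This is the exception that oslo_service.periodic_task logs above
        # before moving on to the next periodic task.
        pass
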
2025-10-09 14:57:55.770 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-09 14:57:55.771 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10780
2025-10-09 14:57:59.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 14:57:59.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 14:57:59.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-09 14:57:59.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 14:57:59.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 14:57:59.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 14:58:04.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 14:58:04.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 14:58:04.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-09 14:58:04.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 14:58:04.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 14:58:04.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 14:58:09.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 14:58:09.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 14:58:09.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-09 14:58:09.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 14:58:09.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 14:58:09.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 14:58:14.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 14:58:14.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09
14:58:19.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:58:19.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:58:19.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:58:19.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:58:19.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:58:19.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:58:20.393 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec 2025-10-09 14:58:24.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:58:29.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:58:34.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:58:34.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:58:39.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:58:39.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:58:39.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:58:39.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:58:39.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:58:39.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:58:44.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:58:44.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:58:44.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:58:44.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:58:44.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:58:44.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:58:49.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:58:49.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:58:49.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:58:49.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:58:49.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:58:49.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:58:54.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:58:54.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:58:54.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:58:54.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:58:54.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:58:54.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Error during ComputeManager._run_pending_deletes: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 1df056c729b74390aae3c1259e864b15 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task return self.greenlet.switch() 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task _queue.Empty 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task During 
handling of the above exception, another exception occurred: 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task task(self, context) 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10787, in _run_pending_deletes 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task instances = objects.InstanceList.get_by_filters( 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions( 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions', 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg, 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message, 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout, 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout, 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout) 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout( 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 1df056c729b74390aae3c1259e864b15 2025-10-09 14:58:55.775 2 ERROR oslo_service.periodic_task 2025-10-09 14:58:55.776 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:58:55.776 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:10818 2025-10-09 14:58:59.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:58:59.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:58:59.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:58:59.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:58:59.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:58:59.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:59:04.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:59:04.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:59:04.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:59:04.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:59:04.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:59:04.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:59:09.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:59:09.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:59:09.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5040 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:59:09.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:59:09.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:59:09.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:59:14.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:59:19.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:59:19.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:59:20.397 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted 
interval by 50.00 sec 2025-10-09 14:59:24.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:59:24.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:59:24.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:59:24.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:59:24.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:59:24.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:59:29.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:59:29.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:59:29.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5037 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:59:29.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:59:29.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:59:29.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:59:34.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:59:39.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:59:44.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:59:44.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:59:44.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:59:44.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:59:44.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:59:44.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:59:49.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:59:49.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:59:54.952 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:59:54.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 14:59:54.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 14:59:54.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:59:54.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 14:59:54.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Error during ComputeManager._cleanup_incomplete_migrations: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 5517ef71b139457282e8d3ca7023013b 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task return self.greenlet.switch() 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task _queue.Empty 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred: 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task task(self, context) 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10821, in _cleanup_incomplete_migrations 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task migrations = objects.MigrationList.get_by_filters(context, 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions( 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task File 
"/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions', 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg, 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message, 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout, 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout, 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout) 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout( 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 5517ef71b139457282e8d3ca7023013b 2025-10-09 14:59:55.779 2 ERROR oslo_service.periodic_task 2025-10-09 14:59:55.781 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 14:59:59.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:00:04.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:00:04.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:00:04.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5008 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:00:04.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:00:04.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:00:04.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:00:09.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:00:09.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:00:09.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:00:09.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:00:10.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:00:10.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:00:15.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:00:15.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:00:15.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:00:15.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:00:15.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:00:15.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:00:20.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:00:20.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:00:20.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:00:20.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:00:20.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:00:20.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:00:20.401 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec 2025-10-09 15:00:25.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:00:25.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:00:25.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:00:25.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:00:25.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:00:25.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:00:30.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:00:30.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:00:30.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5045 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:00:30.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:00:30.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:00:30.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:00:35.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:00:40.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:00:45.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:00:50.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:00:50.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:00:55.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:00:55.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Error during ComputeManager._cleanup_expired_console_auth_tokens: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 65a98fbf4f764145b8cf5c13691002e4 2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task 
File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task
2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task
2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10917, in _cleanup_expired_console_auth_tokens
2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task objects.ConsoleAuthToken.clean_expired_console_auths(context)
2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 65a98fbf4f764145b8cf5c13691002e4
2025-10-09 15:00:55.785 2 ERROR oslo_service.periodic_task
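
By this point the same timeout has hit ComputeManager.update_available_resource, _run_pending_deletes, _cleanup_incomplete_migrations and now _cleanup_expired_console_auth_tokens, roughly one per minute. Both the "Running periodic task ..." lines and the "Error during ...: MessagingTimeout" lines come from oslo.service's periodic-task runner: it walks the decorated ComputeManager methods in turn, catches whatever each task raises, logs it, and schedules the next one, which is why the service keeps limping along instead of exiting. A rough sketch of that machinery follows, with a made-up manager and task name rather than Nova's real ones:

    # Rough sketch of the periodic-task runner behind the "Running periodic
    # task ..." / "Error during ..." lines. DemoManager and _cleanup_demo are
    # made up; Nova's ComputeManager registers the real tasks.
    from oslo_config import cfg
    from oslo_service import periodic_task


    class DemoManager(periodic_task.PeriodicTasks):
        def __init__(self, conf):
            super().__init__(conf)

        @periodic_task.periodic_task(spacing=60, run_immediately=True)
        def _cleanup_demo(self, context):
            # A body that performs a conductor RPC can raise MessagingTimeout;
            # run_periodic_tasks() catches it, logs
            # "Error during DemoManager._cleanup_demo: ..." and carries on.
            raise RuntimeError("stand-in for MessagingTimeout")


    mgr = DemoManager(cfg.CONF)
    # In a real service a timer in the service loop drives this call.
    mgr.run_periodic_tasks(context=None)

Tasks run sequentially inside run_periodic_tasks(), so one task blocking for the full RPC timeout delays the tasks queued behind it, which is consistent with the one-minute spacing of the task names in this log.
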
2025-10-09 15:00:56.695 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-09 15:00:56.696 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-09 15:01:00.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 15:01:00.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 15:01:05.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 15:01:05.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 15:01:10.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:01:10.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:01:10.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-09 15:01:10.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:01:10.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 15:01:10.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:01:15.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:01:15.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:01:15.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5046 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-09 15:01:15.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:01:15.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:01:15.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 15:01:20.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:01:20.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:01:20.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-09 15:01:20.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-]
tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:01:20.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:01:20.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:01:20.405 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec 2025-10-09 15:01:25.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:01:30.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:01:35.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:01:35.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:01:40.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:01:45.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:01:45.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:01:50.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:01:50.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:01:55.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:01:55.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Error during ComputeManager._sync_scheduler_instance_info: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 87e17aa744214038b0e10e8c5907965f 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task return 
self.greenlet.switch() 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task _queue.Empty 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred: 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task task(self, context) 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2137, in _sync_scheduler_instance_info 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task instances = objects.InstanceList.get_by_host(context, self.host, 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions( 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions', 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg, 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message, 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout, 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout, 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout) 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout( 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 87e17aa744214038b0e10e8c5907965f 2025-10-09 15:01:56.700 2 ERROR oslo_service.periodic_task 2025-10-09 15:01:56.702 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 15:01:56.702 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-09 15:01:56.703 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518 2025-10-09 15:02:00.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:02:00.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:02:05.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:02:05.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:02:10.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:02:15.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:02:15.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:02:20.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:02:20.409 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec 2025-10-09 15:02:25.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:02:25.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:02:30.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:02:30.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:02:35.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:02:35.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:02:40.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:02:40.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:02:45.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:02:50.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:02:55.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 
15:02:55.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Error during ComputeManager._heal_instance_info_cache: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 49c97669c8c84087a1f10b940456854f 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task return self.greenlet.switch() 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task _queue.Empty 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred: 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task task(self, context) 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9519, in _heal_instance_info_cache 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task db_instances = objects.InstanceList.get_by_host( 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions( 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions', 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg, 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message, 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout, 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout, 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout) 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout( 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 49c97669c8c84087a1f10b940456854f 2025-10-09 15:02:56.706 2 ERROR oslo_service.periodic_task 2025-10-09 15:02:56.708 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 15:02:56.708 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 15:02:56.709 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 15:02:56.709 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 15:03:00.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:03:00.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:03:05.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:03:05.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:03:10.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:03:10.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:03:15.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:03:15.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:03:20.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 
23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:03:20.412 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec 2025-10-09 15:03:25.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:03:30.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:03:30.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:03:35.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:03:40.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:03:40.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:03:45.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:03:45.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:03:45.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:03:45.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:03:45.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:03:45.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:03:50.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:03:55.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Error during ComputeManager._instance_usage_audit: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 1a12cad7a2dd4553b27475c790b78070 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task File 
"/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task return self.greenlet.switch() 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task _queue.Empty 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred: 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task task(self, context) 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9768, in _instance_usage_audit 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task if objects.TaskLog.get(context, 'instance_usage_audit', begin, end, 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/objects/base.py", line 355, in wrapper 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task return fn.__get__(None, obj)(*args, **kwargs) 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions( 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions', 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg, 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message, 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout, 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout, 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout) 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout( 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task 
oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 1a12cad7a2dd4553b27475c790b78070 2025-10-09 15:03:56.712 2 ERROR oslo_service.periodic_task 2025-10-09 15:03:56.714 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 15:03:56.714 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 15:03:56.714 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-09 15:03:56.714 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 15:04:00.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:04:00.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:04:05.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:04:05.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:04:10.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:04:15.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:04:20.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:04:20.416 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec 2025-10-09 15:04:25.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:04:30.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:04:35.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:04:40.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:04:45.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:04:50.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:04:50.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:04:55.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:04:55.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Error during ComputeManager.update_available_resource: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 83c09984228642b2be178d1851c67011 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task return self.greenlet.switch() 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task _queue.Empty 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred: 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task task(self, context) 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10225, in update_available_resource 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task compute_nodes_in_db = self._get_compute_nodes_in_db(context, 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10272, in _get_compute_nodes_in_db 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task return objects.ComputeNodeList.get_all_by_host(context, self.host, 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions( 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions', 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task self.transport._send(self.target, 
msg_ctxt, msg, 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message, 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout, 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout, 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout) 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout( 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 83c09984228642b2be178d1851c67011 2025-10-09 15:04:56.719 2 ERROR oslo_service.periodic_task 2025-10-09 15:04:56.721 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 15:04:56.722 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10780 2025-10-09 15:05:00.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:05:00.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:05:05.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:05:05.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:05:10.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:05:10.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:05:15.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:05:15.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:05:20.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:05:20.419 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec 
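The records above repeat one pattern: each ComputeManager periodic task (_heal_instance_info_cache, _instance_usage_audit, update_available_resource, _run_pending_deletes, and the later _cleanup_incomplete_migrations, _cleanup_expired_console_auth_tokens, _sync_scheduler_instance_info) raises oslo_messaging.exceptions.MessagingTimeout roughly 60 seconds after it starts, always through the same frame, cctxt.call(context, 'object_class_action_versions', ...) in nova/conductor/rpcapi.py, while the nova.servicegroup DbDriver._report_state loop overruns its interval by about 50 seconds once a minute. The local OVSDB session on tcp:127.0.0.1:6640 keeps answering its inactivity probes throughout, so the symptoms are consistent with nova-compute sending RPC requests but never receiving replies from nova-conductor over the message bus, rather than with a local fault on this host. As a quick triage aid, the stdlib-only sketch below (not part of Nova; the log file name is an illustrative assumption) tallies which periodic tasks hit the timeout in a saved copy of this log:

    #!/usr/bin/env python3
    """Count MessagingTimeout errors per ComputeManager periodic task."""
    import re
    from collections import Counter

    # Matches the ERROR summary line emitted by oslo_service.periodic_task above.
    PATTERN = re.compile(
        r"Error during ComputeManager\.(?P<task>\w+): "
        r"oslo_messaging\.exceptions\.MessagingTimeout: "
        r"Timed out waiting for a reply to message ID (?P<msg_id>[0-9a-f]+)"
    )

    def tally_timeouts(path="nova-compute.log"):
        # "nova-compute.log" is an illustrative file name, not taken from this log.
        """Return a Counter mapping periodic-task name to number of timeouts."""
        counts = Counter()
        with open(path, encoding="utf-8") as fh:
            for line in fh:
                match = PATTERN.search(line)
                if match:
                    counts[match.group("task")] += 1
        return counts

    if __name__ == "__main__":
        for task, hits in tally_timeouts().most_common():
            print(f"{task}: {hits}")

If the counts keep growing across all tasks while the OVSDB probes stay healthy, as they do in this capture, the shared RPC path (the AMQP connection between this host and the nova-conductor/messaging endpoints) is the next thing to check.
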
2025-10-09 15:05:25.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:05:30.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:05:30.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:05:35.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:05:35.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:05:40.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:05:45.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:05:50.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:05:55.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Error during ComputeManager._run_pending_deletes: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 19028cb861d34dfa9ae693afbcb4dd02 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task return self.greenlet.switch() 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task _queue.Empty 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred: 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task task(self, context) 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10787, in _run_pending_deletes 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task instances = 
objects.InstanceList.get_by_filters( 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions( 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions', 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg, 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message, 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout, 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout, 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout) 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout( 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 19028cb861d34dfa9ae693afbcb4dd02 2025-10-09 15:05:56.725 2 ERROR oslo_service.periodic_task 2025-10-09 15:05:56.727 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 15:05:56.728 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:10818 2025-10-09 15:06:00.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:06:00.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:06:05.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:06:05.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:06:10.396 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:06:10.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:06:10.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:06:10.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:06:10.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:06:10.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:06:15.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:06:15.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:06:15.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5033 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:06:15.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:06:15.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:06:15.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:06:20.426 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec 2025-10-09 15:06:20.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:06:25.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:06:25.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:06:30.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:06:30.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:06:35.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:06:40.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:06:40.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:06:45.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:06:45.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:06:50.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:06:50.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:06:55.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Error during ComputeManager._cleanup_incomplete_migrations: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID d3fa186cbe7c4d1f9e475e23b68a931b 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task return self.greenlet.switch() 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task _queue.Empty 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred: 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task task(self, context) 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10821, in _cleanup_incomplete_migrations 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task migrations = objects.MigrationList.get_by_filters(context, 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions( 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions', 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task 
self.transport._send(self.target, msg_ctxt, msg, 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message, 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout, 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout, 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout) 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout( 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID d3fa186cbe7c4d1f9e475e23b68a931b 2025-10-09 15:06:56.731 2 ERROR oslo_service.periodic_task 2025-10-09 15:06:56.733 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 15:07:00.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:07:00.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:07:05.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:07:05.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:07:10.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:07:10.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:07:15.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:07:15.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:07:20.432 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec 2025-10-09 15:07:20.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:07:20.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 
15:07:25.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:07:25.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:07:30.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:07:35.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:07:40.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:07:45.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:07:50.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:07:55.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Error during ComputeManager._cleanup_expired_console_auth_tokens: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID c29d1f2418984e45a62ad143bb85cb4a 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task return self.greenlet.switch() 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task _queue.Empty 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred: 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task task(self, context) 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10917, in _cleanup_expired_console_auth_tokens 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task objects.ConsoleAuthToken.clean_expired_console_auths(context) 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task File 
"/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions( 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions', 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg, 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message, 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout, 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout, 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout) 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout( 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID c29d1f2418984e45a62ad143bb85cb4a 2025-10-09 15:07:56.737 2 ERROR oslo_service.periodic_task 2025-10-09 15:08:00.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:08:00.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:08:05.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:08:05.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:08:10.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:08:10.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:08:15.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:08:15.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:08:20.439 2 WARNING 
oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec 2025-10-09 15:08:20.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:08:24.277 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 15:08:24.278 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 15:08:25.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:08:30.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:08:35.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:08:40.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:08:45.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:08:45.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:08:50.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:08:55.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:09:00.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:09:05.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:09:10.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:09:10.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:09:15.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:09:20.443 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec 2025-10-09 15:09:20.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Error during ComputeManager._sync_scheduler_instance_info: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 0c8e3a743bd7411cae9bc70e077641cc 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task return self.greenlet.switch() 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task _queue.Empty 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred: 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task task(self, context) 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2137, in _sync_scheduler_instance_info 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task instances = objects.InstanceList.get_by_host(context, self.host, 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions( 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions', 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg, 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message, 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout, 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout, 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout) 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout( 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 0c8e3a743bd7411cae9bc70e077641cc 2025-10-09 15:09:24.285 2 ERROR oslo_service.periodic_task 2025-10-09 15:09:24.287 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 15:09:24.288 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-09 15:09:24.288 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518 2025-10-09 15:09:25.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:09:30.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:09:30.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:09:30.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:09:30.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:09:30.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:09:30.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:09:30.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:09:35.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:09:35.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:09:40.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:09:45.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:09:50.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:09:55.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:10:00.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:10:05.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:10:10.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:10:15.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:10:20.446 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec 2025-10-09 15:10:20.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Error during ComputeManager._heal_instance_info_cache: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 4d24ac7e293e4049a99c328aac1b9b07 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task return self.greenlet.switch() 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task _queue.Empty 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred: 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task task(self, context) 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9519, in _heal_instance_info_cache 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task db_instances = objects.InstanceList.get_by_host( 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions( 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task File 
"/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions', 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg, 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message, 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout, 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout, 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout) 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout( 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 4d24ac7e293e4049a99c328aac1b9b07 2025-10-09 15:10:24.294 2 ERROR oslo_service.periodic_task 2025-10-09 15:10:24.296 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 15:10:24.296 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 15:10:24.296 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 15:10:24.296 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 15:10:25.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:10:30.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:10:35.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:10:35.542 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:10:35.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:10:35.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:10:35.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:10:35.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:10:40.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:10:45.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:10:45.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:10:50.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:10:50.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:10:55.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:11:00.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:11:00.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:11:05.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:11:10.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:11:15.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:11:15.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:11:15.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:11:15.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:11:15.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:11:15.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:11:20.453 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec 2025-10-09 15:11:20.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:11:20.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Error during ComputeManager._instance_usage_audit: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID b7fe64bf33a5407bb8e51d986e6bb10a 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task return self.greenlet.switch() 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task _queue.Empty 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred: 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task task(self, context) 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9768, in _instance_usage_audit 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task if objects.TaskLog.get(context, 'instance_usage_audit', begin, end, 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/objects/base.py", line 355, in wrapper 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task return fn.__get__(None, obj)(*args, **kwargs) 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions( 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions', 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg, 2025-10-09 15:11:24.303 2 ERROR 
oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message, 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout, 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout, 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout) 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout( 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID b7fe64bf33a5407bb8e51d986e6bb10a 2025-10-09 15:11:24.303 2 ERROR oslo_service.periodic_task 2025-10-09 15:11:24.304 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 15:11:24.304 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 15:11:25.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:11:25.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:11:25.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:11:25.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:11:25.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:11:25.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:11:30.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:11:30.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:11:30.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:11:30.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:11:30.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:11:30.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:11:35.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:11:40.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:11:40.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:11:45.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:11:50.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:11:55.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:11:55.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:11:55.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5006 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:11:55.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:11:55.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:11:55.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:12:00.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:12:00.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:12:05.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:12:05.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:12:10.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:12:10.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:12:10.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5006 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:12:10.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:12:10.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 
15:12:10.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:12:15.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:12:15.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:12:15.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5048 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:12:15.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:12:15.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:12:15.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:12:20.457 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec 2025-10-09 15:12:20.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Error during ComputeManager._sync_power_states: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 617def7543ae481d9c246b42c55530ad 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task return self.greenlet.switch() 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task _queue.Empty 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred: 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task task(self, context) 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9878, in 
_sync_power_states 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task db_instances = objects.InstanceList.get_by_host(context, self.host, 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions( 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions', 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg, 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message, 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout, 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout, 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout) 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout( 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 617def7543ae481d9c246b42c55530ad 2025-10-09 15:12:24.310 2 ERROR oslo_service.periodic_task 2025-10-09 15:12:24.311 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 15:12:24.312 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-09 15:12:24.312 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 15:12:25.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:12:25.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:12:30.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:12:35.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:12:40.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:12:40.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:12:40.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5047 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:12:40.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:12:40.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:12:40.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:12:45.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:12:45.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:12:50.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:12:50.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:12:55.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:13:00.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:13:00.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:13:05.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:13:10.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:13:10.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:13:10.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5043 ms, sending inactivity probe run 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:13:10.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:13:10.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:13:10.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:13:10.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:13:15.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:13:15.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:13:15.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:13:15.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:13:15.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:13:15.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:13:20.464 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec 2025-10-09 15:13:20.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:13:20.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:13:20.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:13:20.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:13:20.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:13:20.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Error during ComputeManager.update_available_resource: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 5297fbfd4f43486cb5c7890ea2bd7b9d 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task File 
"/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task return self.greenlet.switch() 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task _queue.Empty 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred: 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task task(self, context) 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10225, in update_available_resource 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task compute_nodes_in_db = self._get_compute_nodes_in_db(context, 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10272, in _get_compute_nodes_in_db 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task return objects.ComputeNodeList.get_all_by_host(context, self.host, 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions( 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions', 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg, 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message, 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout, 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout, 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout) 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout( 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 5297fbfd4f43486cb5c7890ea2bd7b9d 2025-10-09 15:13:24.319 2 ERROR oslo_service.periodic_task 2025-10-09 15:13:24.321 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 15:13:25.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:13:25.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:13:25.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:13:25.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:13:25.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:13:25.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:13:30.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:13:30.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:13:30.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:13:30.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:13:31.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:13:31.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:13:36.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:13:36.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:13:41.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:13:41.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:13:46.024 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:13:46.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:13:51.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:13:51.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:13:56.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:14:01.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:14:01.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:14:06.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:14:06.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:14:11.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:14:11.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:14:16.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:14:16.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:14:20.467 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec 2025-10-09 15:14:21.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:14:21.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Error during ComputeManager._cleanup_running_deleted_instances: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 2a7d32c41e8b40a3896c8e827ac9888d 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task File 
"/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task return self.greenlet.switch() 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task _queue.Empty 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred: 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task task(self, context) 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10323, in _cleanup_running_deleted_instances 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task instances = self._running_deleted_instances(context) 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10378, in _running_deleted_instances 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task instances = self._get_instances_on_driver(context, filters) 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 656, in _get_instances_on_driver 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task local_instances = objects.InstanceList.get_by_filters( 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions( 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions', 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg, 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message, 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout, 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout, 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout) 2025-10-09 
15:14:24.328 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout( 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 2a7d32c41e8b40a3896c8e827ac9888d 2025-10-09 15:14:24.328 2 ERROR oslo_service.periodic_task 2025-10-09 15:14:24.329 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-09 15:14:24.329 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10780 2025-10-09 15:14:25.586 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [00e89026-9a19-4319-bb98-464f734dd4a5] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: (0, 0): (320) CONNECTION_FORCED - broker forced connection closure with reason 'shutdown'. Trying again in 1 seconds.: amqp.exceptions.ConnectionForced: (0, 0): (320) CONNECTION_FORCED - broker forced connection closure with reason 'shutdown' 2025-10-09 15:14:25.587 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [24791929-3301-47b6-8e55-cc46f9b9efe3] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 104] Connection reset by peer. Trying again in 1 seconds.: ConnectionResetError: [Errno 104] Connection reset by peer 2025-10-09 15:14:26.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:14:26.615 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [24791929-3301-47b6-8e55-cc46f9b9efe3] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 2 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:14:26.623 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [00e89026-9a19-4319-bb98-464f734dd4a5] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 2 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:14:27.196 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 104] Connection reset by peer 2025-10-09 15:14:27.213 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 1.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:14:28.230 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 3.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:14:28.651 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [00e89026-9a19-4319-bb98-464f734dd4a5] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 4 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:14:28.652 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [24791929-3301-47b6-8e55-cc46f9b9efe3] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 4 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:14:31.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:14:31.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:14:31.249 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 5.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:14:31.739 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 104] Connection reset by peer 2025-10-09 15:14:32.674 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [00e89026-9a19-4319-bb98-464f734dd4a5] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 6 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:14:32.675 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [24791929-3301-47b6-8e55-cc46f9b9efe3] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 6 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:14:36.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:14:36.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:14:36.270 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 7.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:14:38.711 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [00e89026-9a19-4319-bb98-464f734dd4a5] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 8 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:14:38.712 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [24791929-3301-47b6-8e55-cc46f9b9efe3] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 8 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:14:41.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:14:41.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:14:43.287 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 9.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:14:46.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:14:46.751 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [24791929-3301-47b6-8e55-cc46f9b9efe3] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 10 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:14:46.752 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [00e89026-9a19-4319-bb98-464f734dd4a5] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 10 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:14:48.776 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-09 15:14:51.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:14:51.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:14:52.303 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 11.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:14:56.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:14:56.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:14:56.780 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [24791929-3301-47b6-8e55-cc46f9b9efe3] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 12 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:14:56.781 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [00e89026-9a19-4319-bb98-464f734dd4a5] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 12 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:15:01.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:15:01.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:15:03.332 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 13.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:15:05.822 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-09 15:15:06.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:15:06.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:15:08.808 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [00e89026-9a19-4319-bb98-464f734dd4a5] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 14 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:15:08.808 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [24791929-3301-47b6-8e55-cc46f9b9efe3] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 14 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:15:11.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:15:11.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:15:16.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:15:16.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:15:16.361 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 15.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:15:20.471 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec 2025-10-09 15:15:21.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:15:21.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:15:22.852 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [24791929-3301-47b6-8e55-cc46f9b9efe3] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 16 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:15:22.853 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [00e89026-9a19-4319-bb98-464f734dd4a5] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 16 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-09 15:15:22.868 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Error during ComputeManager._run_pending_deletes: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID c4bf00a1d3eb4a1295361829238ebb72
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task     return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task   File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task     return waiter.wait()
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task   File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task     return get_hub().switch()
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task   File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task     return self.greenlet.switch()
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task 
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task 
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task   File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task     task(self, context)
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10787, in _run_pending_deletes
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task     instances = objects.InstanceList.get_by_filters(
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task   File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task     result = cls.indirection_api.object_class_action_versions(
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task   File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task     return cctxt.call(context, 'object_class_action_versions',
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task     self.transport._send(self.target, msg_ctxt, msg,
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task     return self._driver.send(target, ctxt, message,
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task     return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task     result = self._waiter.wait(msg_id, timeout,
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task     message = self.waiters.get(msg_id, timeout=timeout)
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task     raise oslo_messaging.MessagingTimeout(
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID c4bf00a1d3eb4a1295361829238ebb72
2025-10-09 15:15:24.332 2 ERROR oslo_service.periodic_task 
2025-10-09 15:15:24.333 2 DEBUG oslo_service.periodic_task [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-09 15:15:24.333 2 DEBUG nova.compute.manager [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:10818
2025-10-09 15:15:24.343 2 ERROR oslo.messaging._drivers.impl_rabbit [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Connection failed: [Errno 111] ECONNREFUSED (retrying in 1.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-09 15:15:25.358 2 ERROR oslo.messaging._drivers.impl_rabbit [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Connection failed: [Errno 111] ECONNREFUSED (retrying in 3.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-09 15:15:26.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 15:15:26.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 15:15:28.377 2 ERROR oslo.messaging._drivers.impl_rabbit [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Connection failed: [Errno 111] ECONNREFUSED (retrying in 5.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-09 15:15:31.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 15:15:31.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 15:15:31.394 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 17.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-09 15:15:33.396 2 ERROR oslo.messaging._drivers.impl_rabbit [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Connection failed: [Errno 111] ECONNREFUSED (retrying in 7.0 seconds): ConnectionRefusedError:
[Errno 111] ECONNREFUSED 2025-10-09 15:15:36.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:15:36.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:15:38.900 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [24791929-3301-47b6-8e55-cc46f9b9efe3] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 18 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:15:38.901 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [00e89026-9a19-4319-bb98-464f734dd4a5] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 18 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:15:39.916 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-09 15:15:40.415 2 ERROR oslo.messaging._drivers.impl_rabbit [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Connection failed: [Errno 111] ECONNREFUSED (retrying in 9.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:15:41.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:15:41.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:15:46.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:15:46.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:15:48.421 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 19.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:15:49.438 2 ERROR oslo.messaging._drivers.impl_rabbit [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Connection failed: [Errno 111] ECONNREFUSED (retrying in 11.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:15:51.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:15:51.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:15:56.081 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:15:56.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:15:56.946 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [24791929-3301-47b6-8e55-cc46f9b9efe3] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 20 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:15:56.947 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [00e89026-9a19-4319-bb98-464f734dd4a5] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 20 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:15:56.961 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-09 15:16:00.455 2 ERROR oslo.messaging._drivers.impl_rabbit [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Connection failed: [Errno 111] ECONNREFUSED (retrying in 13.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:16:01.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:16:01.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:16:06.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:16:06.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:16:07.457 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 21.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:16:11.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:16:11.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:16:13.480 2 ERROR oslo.messaging._drivers.impl_rabbit [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Connection failed: [Errno 111] ECONNREFUSED (retrying in 15.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:16:14.001 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-09 15:16:16.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:16:16.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:16:16.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:16:16.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:16:16.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:16:16.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:16:16.987 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [00e89026-9a19-4319-bb98-464f734dd4a5] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 22 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:16:16.988 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [24791929-3301-47b6-8e55-cc46f9b9efe3] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 22 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:16:21.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:16:26.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:16:28.490 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 23.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:16:28.499 2 ERROR oslo.messaging._drivers.impl_rabbit [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Connection failed: [Errno 111] ECONNREFUSED (retrying in 17.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:16:31.035 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-09 15:16:31.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:16:31.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:16:36.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:16:39.031 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [24791929-3301-47b6-8e55-cc46f9b9efe3] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 24 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:16:39.031 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [00e89026-9a19-4319-bb98-464f734dd4a5] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 24 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:16:41.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:16:45.533 2 ERROR oslo.messaging._drivers.impl_rabbit [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Connection failed: [Errno 111] ECONNREFUSED (retrying in 19.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:16:46.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:16:48.091 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-09 15:16:51.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:16:51.532 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 25.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:16:56.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:17:01.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:17:03.087 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [00e89026-9a19-4319-bb98-464f734dd4a5] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 26 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:17:03.088 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [24791929-3301-47b6-8e55-cc46f9b9efe3] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 26 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:17:04.557 2 ERROR oslo.messaging._drivers.impl_rabbit [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Connection failed: [Errno 111] ECONNREFUSED (retrying in 21.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:17:05.137 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-09 15:17:06.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:17:06.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:17:06.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:17:06.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:17:06.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:17:06.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:17:11.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:17:11.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:17:16.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:17:16.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:17:16.572 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 27.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:17:21.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:17:22.176 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-09 15:17:25.598 2 ERROR oslo.messaging._drivers.impl_rabbit [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Connection failed: [Errno 111] ECONNREFUSED (retrying in 23.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:17:26.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:17:26.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:17:26.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:17:26.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:17:26.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering 
ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:17:26.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:17:29.136 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [24791929-3301-47b6-8e55-cc46f9b9efe3] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 28 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:17:29.137 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [00e89026-9a19-4319-bb98-464f734dd4a5] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 28 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:17:31.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:17:31.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:17:31.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:17:31.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:17:31.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:17:31.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:17:36.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:17:39.218 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-09 15:17:41.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:17:41.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:17:41.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:17:41.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:17:41.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:17:41.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:17:43.620 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 29.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:17:46.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:17:46.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 
15:17:48.645 2 ERROR oslo.messaging._drivers.impl_rabbit [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Connection failed: [Errno 111] ECONNREFUSED (retrying in 25.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:17:51.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:17:51.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:17:51.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:17:51.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:17:51.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:17:51.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:17:56.251 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-09 15:17:56.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:17:56.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:17:56.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:17:56.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:17:56.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:17:56.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:17:57.201 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [24791929-3301-47b6-8e55-cc46f9b9efe3] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 30 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:17:57.203 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [00e89026-9a19-4319-bb98-464f734dd4a5] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 30 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:18:01.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:18:01.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:18:01.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:18:01.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:18:01.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:18:01.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:18:06.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:18:06.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:18:06.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:18:06.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:18:06.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:18:06.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:18:11.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:18:11.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:18:11.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5046 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:18:11.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:18:11.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:18:11.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:18:12.663 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:18:13.288 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-09 15:18:13.674 2 ERROR oslo.messaging._drivers.impl_rabbit [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Connection failed: [Errno 111] ECONNREFUSED (retrying in 27.0 seconds): ConnectionRefusedError: 
[Errno 111] ECONNREFUSED 2025-10-09 15:18:16.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:18:21.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:18:21.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:18:21.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:18:21.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:18:21.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:18:21.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:18:26.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:18:26.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:18:26.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5040 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:18:26.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:18:26.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:18:26.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:18:27.260 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [00e89026-9a19-4319-bb98-464f734dd4a5] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:18:27.261 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [24791929-3301-47b6-8e55-cc46f9b9efe3] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:18:30.327 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-09 15:18:31.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:18:31.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:18:31.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:18:31.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:18:31.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:18:31.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:18:36.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:18:36.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:18:36.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:18:36.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:18:36.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:18:36.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:18:40.707 2 ERROR oslo.messaging._drivers.impl_rabbit [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Connection failed: [Errno 111] ECONNREFUSED (retrying in 29.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:18:41.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:18:41.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:18:41.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:18:41.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:18:41.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:18:41.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:18:43.705 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: 
[Errno 111] ECONNREFUSED 2025-10-09 15:18:46.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:18:46.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:18:46.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:18:46.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:18:46.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:18:46.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:18:47.368 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-09 15:18:51.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:18:51.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:18:51.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:18:51.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:18:51.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:18:51.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:18:56.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:18:56.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:18:56.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5044 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:18:56.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:18:56.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:18:56.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:18:59.319 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [00e89026-9a19-4319-bb98-464f734dd4a5] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:18:59.320 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [24791929-3301-47b6-8e55-cc46f9b9efe3] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:19:01.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:19:01.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:19:01.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:19:01.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:19:01.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:19:01.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:19:04.406 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-09 15:19:06.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:19:06.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:19:06.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:19:06.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:19:06.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:19:06.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:19:09.749 2 ERROR oslo.messaging._drivers.impl_rabbit [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:19:11.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:19:11.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:19:11.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5041 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:19:11.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:19:11.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:19:11.899 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:19:14.753 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:19:16.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:19:16.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:19:16.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5033 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:19:16.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:19:16.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:19:16.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:19:21.447 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-09 15:19:21.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:19:21.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:19:21.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5037 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:19:21.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:19:21.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:19:21.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:19:26.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:19:26.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:19:26.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:19:26.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:19:27.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:19:27.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:19:31.374 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [00e89026-9a19-4319-bb98-464f734dd4a5] AMQP server on standalone.internalapi.localdomain:5672 is 
unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:19:31.375 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [24791929-3301-47b6-8e55-cc46f9b9efe3] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: . Trying again in 1 seconds.: amqp.exceptions.RecoverableConnectionError: 2025-10-09 15:19:32.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:19:32.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:19:32.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:19:32.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:19:32.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:19:32.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:19:32.389 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [24791929-3301-47b6-8e55-cc46f9b9efe3] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 2 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:19:34.400 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [24791929-3301-47b6-8e55-cc46f9b9efe3] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 4 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-09 15:19:37.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:19:37.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-09 15:19:37.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5028 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-09 15:19:37.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:19:37.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-09 15:19:37.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-09 15:19:38.419 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [24791929-3301-47b6-8e55-cc46f9b9efe3] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 6 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-09 15:19:38.492 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-09 15:19:40.803 2 ERROR oslo.messaging._drivers.impl_rabbit [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-09 15:19:42.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:19:42.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:19:42.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5037 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-09 15:19:42.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:19:42.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:19:42.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 15:19:44.431 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [24791929-3301-47b6-8e55-cc46f9b9efe3] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 8 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-09 15:19:45.802 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-09 15:19:47.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:19:47.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:19:47.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-09 15:19:47.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:19:47.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 15:19:47.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:19:52.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:19:52.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:19:52.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-09 15:19:52.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:19:52.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 15:19:52.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:19:52.450 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [24791929-3301-47b6-8e55-cc46f9b9efe3] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 10 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-09 15:19:55.535 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-09 15:19:57.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:19:57.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:19:57.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-09 15:19:57.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:19:57.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 15:19:57.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:20:02.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:20:02.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:20:02.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-09 15:20:02.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:20:02.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 15:20:02.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:20:02.467 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [24791929-3301-47b6-8e55-cc46f9b9efe3] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 12 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-09 15:20:03.421 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [00e89026-9a19-4319-bb98-464f734dd4a5] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-09 15:20:07.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:20:07.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:20:07.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-09 15:20:07.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:20:07.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 15:20:07.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:20:11.838 2 ERROR oslo.messaging._drivers.impl_rabbit [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-09 15:20:12.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:20:12.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:20:12.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-09 15:20:12.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:20:12.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 15:20:12.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:20:12.576 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-09 15:20:14.490 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [24791929-3301-47b6-8e55-cc46f9b9efe3] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 14 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-09 15:20:16.847 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-09 15:20:17.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:20:17.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:20:17.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-09 15:20:17.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:20:17.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 15:20:17.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:20:22.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:20:22.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:20:22.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5040 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-09 15:20:22.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:20:22.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:20:22.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:20:27.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:20:27.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:20:27.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-09 15:20:27.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:20:27.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 15:20:27.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:20:28.521 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [24791929-3301-47b6-8e55-cc46f9b9efe3] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 16 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-09 15:20:29.621 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-09 15:20:32.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:20:32.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:20:32.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-09 15:20:32.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:20:32.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 15:20:32.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:20:35.488 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [00e89026-9a19-4319-bb98-464f734dd4a5] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-09 15:20:37.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:20:37.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:20:37.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-09 15:20:37.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:20:37.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 15:20:37.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:20:42.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:20:42.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:20:42.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-09 15:20:42.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:20:42.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 15:20:42.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:20:42.885 2 ERROR oslo.messaging._drivers.impl_rabbit [req-7baa50ce-2a1d-4b44-b7a5-1d1cb4426f1e - - - - -] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-09 15:20:44.545 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [24791929-3301-47b6-8e55-cc46f9b9efe3] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 18 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-09 15:20:46.665 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-09 15:20:47.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:20:47.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:20:47.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-09 15:20:47.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:20:47.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 15:20:47.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:20:47.888 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-09 15:20:52.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:20:52.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:20:52.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-09 15:20:52.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:20:52.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 15:20:52.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:20:57.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:20:57.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-09 15:20:57.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-09 15:20:57.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-09 15:20:57.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-09 15:20:57.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519