2025-10-06 15:59:45.237 2 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44 2025-10-06 15:59:45.238 2 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44 2025-10-06 15:59:45.238 2 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44 2025-10-06 15:59:45.238 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs 2025-10-06 15:59:45.290 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 15:59:45.314 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 15:59:45.834 2 INFO nova.virt.driver [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] Loading compute driver 'libvirt.LibvirtDriver' 2025-10-06 15:59:45.949 2 INFO nova.compute.provider_config [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] No provider configs found in /etc/nova/provider_config. If files are present, ensure the Nova process has access. 2025-10-06 15:59:45.958 2 DEBUG oslo_concurrency.lockutils [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:266 2025-10-06 15:59:45.958 2 DEBUG oslo_concurrency.lockutils [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:282 2025-10-06 15:59:45.958 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:363 2025-10-06 15:59:45.958 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2593 2025-10-06 15:59:45.959 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594 2025-10-06 15:59:45.959 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2595 2025-10-06 15:59:45.959 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] config files: ['/usr/share/nova/nova-dist.conf', '/etc/nova/nova.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2596 2025-10-06 15:59:45.959 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2598 2025-10-06 15:59:45.959 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.959 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] arq_binding_timeout = 300 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.959 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] backdoor_port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.960 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.960 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.960 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.960 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.960 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.960 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.960 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] config_dir = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.961 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.961 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] config_file = ['/usr/share/nova/nova-dist.conf', '/etc/nova/nova.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.961 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.961 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] console_host = standalone.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.961 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.961 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cpu_allocation_ratio = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.962 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.962 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.962 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] default_access_ip_network_name = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.962 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.962 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.962 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.963 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.963 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] disk_allocation_ratio = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.963 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.963 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.963 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.963 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.964 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] force_config_drive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.964 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.964 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.964 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.964 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] host = standalone.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.964 2 DEBUG 
oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] initial_cpu_allocation_ratio = 16.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.964 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] initial_disk_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.965 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] initial_ram_allocation_ratio = 1.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.965 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] injected_network_template = /usr/share/nova/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.965 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.965 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.965 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.965 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.965 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] instance_usage_audit = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.966 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] instance_usage_audit_period = hour log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.966 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.966 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.966 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.966 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.966 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.966 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.967 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.967 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] log_dir = /var/log/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.967 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.967 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.967 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.967 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.967 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.968 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.968 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.968 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.968 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.968 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] logging_user_identity_format = %(user)s %(tenant)s %(domain)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.968 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.968 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.969 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.969 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.969 2 DEBUG 
oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.969 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.969 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.969 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.969 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.970 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] metadata_listen_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.970 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.970 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.970 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] mkisofs_cmd = mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.970 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] my_block_storage_ip = 172.17.0.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.970 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] my_ip = 172.17.0.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.970 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.971 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.971 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.971 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] osapi_compute_listen_port = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.971 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.971 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] osapi_compute_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.971 2 DEBUG 
oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.971 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.972 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.972 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.972 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.972 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.972 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.972 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.973 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.973 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.973 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.973 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.973 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.973 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.973 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.974 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.974 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.974 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] reserved_host_disk_mb = 0 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.974 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.974 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.974 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.974 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.975 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.975 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.975 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.975 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.975 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.975 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.976 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.976 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.976 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.976 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.976 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.977 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.977 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] 
shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.977 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.977 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.977 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.977 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.977 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.978 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.978 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.978 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.978 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.978 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.978 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.978 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.979 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.979 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.979 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.979 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.979 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.979 2 DEBUG 
oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.979 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.980 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.980 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.980 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.980 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.980 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2606 2025-10-06 15:59:45.980 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.980 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.981 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.981 2 WARNING oslo_config.cfg [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] Deprecated: Option "dhcp_domain" from group "DEFAULT" is deprecated. Use option "dhcp_domain" from group "api". 
2025-10-06 15:59:45.981 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.981 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api.enable_instance_password = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.981 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.982 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.982 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.982 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.982 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.982 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.982 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.983 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.983 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.983 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.983 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.983 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.983 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.984 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.984 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - 
-] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.984 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.984 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.984 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.984 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.backend = dogpile.cache.memcached log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.984 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.985 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.985 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.985 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.985 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.enable_retry_client = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.985 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.985 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.985 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.986 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.986 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.986 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.986 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.986 2 DEBUG 
oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.986 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.986 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.987 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.memcache_servers = ['standalone.internalapi.localdomain:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.987 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.987 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.987 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.987 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.987 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.987 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.988 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.988 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.988 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.988 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.988 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.988 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.988 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 
2025-10-06 15:59:45.989 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cinder.auth_type = v3password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.989 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.989 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.989 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.989 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.989 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.990 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.990 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.990 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.990 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.990 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cinder.os_region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.990 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.990 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.991 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.991 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.991 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.991 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] compute.image_type_exclude_list = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.991 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.991 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.991 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.992 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] compute.packing_host_numa_cells_allocation_strategy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.992 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] compute.provider_config_location = /etc/nova/provider_config log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.992 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.992 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.992 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.992 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.993 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.993 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.993 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.993 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.993 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.993 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.993 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.994 2 
DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.994 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.994 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.994 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.994 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.994 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.994 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.995 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.995 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.995 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.995 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.995 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.995 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.995 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.996 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.996 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.996 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.996 2 DEBUG 
oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.996 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api_database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.996 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.996 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.997 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api_database.max_overflow = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.997 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api_database.max_pool_size = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.997 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.997 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.997 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.997 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api_database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.997 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.998 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.998 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] devices.enabled_vgpu_types = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.998 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.998 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.998 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.998 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.api_servers = 
None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.998 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.999 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.999 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.999 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.999 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.999 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.999 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:45.999 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.000 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.000 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.000 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.000 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.000 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.000 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.000 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.001 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.001 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.rbd_connect_timeout = 5 log_opt_values 
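
Each record in this dump is emitted by oslo.config's ConfigOpts.log_opt_values(); the trailing "log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613" on every record is oslo.log's debug-format suffix naming the emitting function and its source location. A minimal sketch of the same mechanism, using a couple of toy options standing in for the [glance] entries (names and defaults here are illustrative, not Nova's own registrations):

import logging

from oslo_config import cfg

LOG = logging.getLogger(__name__)

# Toy stand-ins for the glance.* options dumped above.
opts = [
    cfg.IntOpt('num_retries', default=3),
    cfg.ListOpt('valid_interfaces', default=['internal']),
]

CONF = cfg.ConfigOpts()
CONF.register_opts(opts, group='glance')

logging.basicConfig(level=logging.DEBUG)
CONF([])  # parse an empty command line; no config files needed for the demo

# Prints one DEBUG record per registered option, group-prefixed as above.
CONF.log_opt_values(LOG, logging.DEBUG)
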
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.001 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.001 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.001 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.001 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.001 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.002 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.002 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.002 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.002 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.002 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.002 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.003 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] glance.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.003 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.003 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.003 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.003 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.003 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.004 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] hyperv.enable_remotefx = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.004 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] hyperv.instances_path_share = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.004 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.004 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.004 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.004 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.004 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.005 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.005 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.005 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.005 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.005 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.005 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.005 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.006 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] mks.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.006 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.006 2 DEBUG oslo_service.service 
[req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.006 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.007 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.007 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.007 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.007 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.007 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.007 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.007 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.008 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.008 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.008 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.008 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.008 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.008 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.008 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.009 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ironic.insecure = False log_opt_values 
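
The per-service blocks in this dump (cyborg.*, glance.*, ironic.*, and later keystone.* and placement.*) share one option set — service_type, valid_interfaces, connect_retries, status_code_retries, endpoint_override, and so on — because these are keystoneauth1 adapter options registered once per group; Nova then layers service-specific defaults (e.g. ironic.service_type = baremetal) on top. A sketch of that registration, assuming only that keystoneauth1 and oslo.config are installed:

from keystoneauth1 import loading as ks_loading
from oslo_config import cfg

CONF = cfg.ConfigOpts()

# Registers the standard adapter options (service_type, valid_interfaces,
# connect_retries, status_code_retries, endpoint_override, ...) under
# [ironic]; the same call per group produces the parallel blocks above.
ks_loading.register_adapter_conf_options(CONF, 'ironic')

CONF([])
# Unset here; in Nova the per-service defaults come from its conf modules.
print(CONF.ironic.service_type, CONF.ironic.valid_interfaces)
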
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.009 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.009 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.009 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.009 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.009 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.009 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.010 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.010 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.010 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.010 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.010 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.010 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.010 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.010 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.011 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.011 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.011 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] key_manager.fixed_key = **** log_opt_values 
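
key_manager.fixed_key just above (like api_database.connection earlier and placement.password later) is logged as **** because the option is declared with secret=True, which makes log_opt_values mask the value rather than print key material. A minimal sketch with a throwaway value:

import logging

from oslo_config import cfg

LOG = logging.getLogger(__name__)

CONF = cfg.ConfigOpts()
# secret=True is what turns the logged value into '****'.
CONF.register_opts([cfg.StrOpt('fixed_key', secret=True)],
                   group='key_manager')

logging.basicConfig(level=logging.DEBUG)
CONF([])
CONF.set_override('fixed_key', 'not-a-real-key', group='key_manager')

CONF.log_opt_values(LOG, logging.DEBUG)  # logs: key_manager.fixed_key = ****
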
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.011 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] barbican.auth_endpoint = http://172.17.0.2:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.011 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.011 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] barbican.barbican_endpoint = http://172.17.0.2:9311 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.012 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] barbican.barbican_endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.012 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.012 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.012 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.012 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.012 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.012 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.013 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.013 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.013 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] barbican.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.013 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] barbican.verify_ssl = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.013 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.013 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.013 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] vault.approle_secret_id = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.014 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] vault.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.014 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] vault.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.014 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.014 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] vault.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.014 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] vault.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.014 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.014 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.015 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.015 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.015 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.015 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.015 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.015 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.015 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.016 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.016 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.016 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 
15:59:46.016 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.016 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.016 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.016 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.017 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] keystone.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.017 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] keystone.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.017 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.017 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.017 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.017 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.017 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.018 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.018 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] keystone.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.018 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.018 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] keystone.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.018 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.connection_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.018 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 
15:59:46.018 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.019 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.019 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.019 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.019 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.disk_cachemodes = ['network=writeback'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.019 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.disk_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.019 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.019 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.020 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.gid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.020 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.hw_disk_discard = unmap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.020 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.hw_machine_type = ['x86_64=pc-q35-rhel9.0.0'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.020 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.images_rbd_ceph_conf = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.020 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.020 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.021 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.021 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.images_rbd_pool = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.021 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] 
libvirt.images_type = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.021 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.021 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.021 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.inject_partition = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.021 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.022 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.022 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.iser_use_multipath = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.022 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.022 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.022 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.022 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.022 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.023 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.live_migration_inbound_addr = standalone.internalapi.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.023 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.023 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.023 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.024 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.live_migration_timeout_action = abort log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.024 2 WARNING oslo_config.cfg [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] Deprecated: Option "live_migration_tunnelled" from group "libvirt" is deprecated for removal ( The "tunnelled live migration" has two inherent limitations: it cannot handle live migration of disks in a non-shared storage setup; and it has a huge performance cost. Both these problems are solved by ``live_migration_with_native_tls`` (requires a pre-configured TLS environment), which is the recommended approach for securing all live migration streams.). Its value may be silently ignored in the future. 2025-10-06 15:59:46.024 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.024 2 WARNING oslo_config.cfg [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( live_migration_uri is deprecated for removal in favor of two other options that allow to change live migration scheme and target URI: ``live_migration_scheme`` and ``live_migration_inbound_addr`` respectively. ). Its value may be silently ignored in the future. 2025-10-06 15:59:46.024 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.live_migration_uri = qemu+ssh://nova_migration@%s:2022/system?keyfile=/etc/nova/migration/identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.025 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.025 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.025 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.025 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.025 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.025 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.026 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.026 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.026 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values 
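
The two WARNING records above are produced by oslo.config itself: an option declared with deprecated_for_removal=True logs "Deprecated: Option ... is deprecated for removal (<deprecated_reason>)" when a user-supplied value for it is read, which is why each warning lands immediately before that option's own DEBUG record. A sketch of the trigger, with a placeholder reason string (the real wording lives in Nova's option definition):

import logging
import tempfile

from oslo_config import cfg

LOG = logging.getLogger(__name__)
logging.basicConfig(level=logging.DEBUG)

CONF = cfg.ConfigOpts()
CONF.register_opts([
    cfg.BoolOpt('live_migration_tunnelled',
                default=False,
                deprecated_for_removal=True,
                deprecated_reason='placeholder: superseded by '
                                  'live_migration_with_native_tls'),
], group='libvirt')

# Setting the option in a config file marks it user-supplied...
with tempfile.NamedTemporaryFile('w', suffix='.conf', delete=False) as f:
    f.write('[libvirt]\nlive_migration_tunnelled = false\n')

CONF([], default_config_files=[f.name])

# ...and reading it back (as log_opt_values does) emits the warning.
CONF.log_opt_values(LOG, logging.DEBUG)
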
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.026 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.num_pcie_ports = 16 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.026 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.026 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.027 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.quobyte_client_cfg = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.027 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.027 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.027 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.027 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.027 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.rbd_secret_uuid = 875a4e5c-1cd1-50e8-b535-172a78890e57 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.027 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.rbd_user = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.028 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.028 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.028 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.028 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.028 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.028 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.028 2 DEBUG 
oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.rx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.029 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.smbfs_mount_options = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.029 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.029 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.029 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.029 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.029 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.030 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.swtpm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.030 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.swtpm_group = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.030 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.030 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.030 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.030 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.030 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.031 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.031 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.volume_clear = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.031 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.volume_clear_size = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.031 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.volume_use_multipath = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.031 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.031 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.031 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.032 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.032 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.vzstorage_mount_perms = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.032 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.032 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.032 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.032 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.033 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] neutron.auth_type = v3password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.033 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.033 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] neutron.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.033 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.033 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.033 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.033 2 DEBUG oslo_service.service 
[req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] neutron.default_floating_pool = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.034 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.034 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.034 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] neutron.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.034 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.034 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.034 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.034 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.035 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.035 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.035 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.035 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.035 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] neutron.service_metadata_proxy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.035 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.036 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.036 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.036 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.036 2 DEBUG oslo_service.service 
[req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.036 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] neutron.timeout = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.036 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] neutron.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.036 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.037 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.037 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.037 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.037 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.037 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.037 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] pci.alias = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.038 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] pci.passthrough_whitelist = ['{"devname":"dummy-dev","physical_network":"dummy_sriov_net"}'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.038 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.038 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.038 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] placement.auth_url = http://172.17.0.2:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.038 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.038 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.039 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] 
Full set of CONF, continued — every entry below was logged 2025-10-06 15:59:46 by PID 2 at DEBUG (oslo_service.service, req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd, log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613), grouped by configuration section:

[placement]
collect_timing = False
connect_retries = None
connect_retry_delay = None
default_domain_id = None
default_domain_name = None
domain_id = None
domain_name = None
endpoint_override = None
insecure = False
keyfile = None
max_version = None
min_version = None
password = ****
project_domain_id = None
project_domain_name = Default
project_id = None
project_name = service
region_name = regionOne
service_name = None
service_type = placement
split_loggers = False
status_code_retries = None
status_code_retry_delay = None
system_scope = None
timeout = None
trust_id = None
user_domain_id = None
user_domain_name = Default
user_id = None
username = placement
valid_interfaces = ['internal']
version = None
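The [placement] section is how this nova-compute reaches the Placement API: Keystone password auth as user placement in project service, region regionOne, internal endpoints only, with the connect/status retry knobs left at their None defaults. A minimal sketch of reading such grouped options with oslo.config (the option registrations below are illustrative stand-ins, not Nova's real definitions):

    # Load /etc/nova/nova.conf and read a few [placement] options.
    from oslo_config import cfg

    CONF = cfg.CONF
    CONF.register_opts(
        [cfg.StrOpt('region_name'),
         cfg.StrOpt('username'),
         cfg.ListOpt('valid_interfaces', default=['internal'])],
        group='placement')
    CONF(['--config-file', '/etc/nova/nova.conf'])
    print(CONF.placement.region_name)  # -> regionOne with the values above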
[powervm]
disk_driver = localdisk
proc_units_factor = 0.1
volume_group_name =

[quota]
cores = 20
count_usage_from_placement = False
driver = nova.quota.DbQuotaDriver
injected_file_content_bytes = 10240
injected_file_path_length = 255
injected_files = 5
instances = 10
key_pairs = 100
metadata_items = 128
ram = 51200
recheck_quota = True
server_group_members = 10
server_groups = 10

[rdp]
enabled = False
html5_proxy_base_url = http://127.0.0.1:6083/
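The [quota] block above is what the configured nova.quota.DbQuotaDriver enforces per project: 20 cores, 10 instances, 51200 MiB of RAM, and so on, with recheck_quota = True re-validating after allocation to catch races. A toy pre-flight check against those numbers (illustrative only, not Nova's quota engine):

    # Hypothetical check of a boot request against the quotas logged above.
    QUOTA = {"cores": 20, "instances": 10, "ram": 51200}  # ram in MiB

    def fits(n, vcpus, ram_mib, used=None):
        used = used or {"cores": 0, "instances": 0, "ram": 0}
        want = {"cores": n * vcpus, "instances": n, "ram": n * ram_mib}
        return all(used[k] + want[k] <= QUOTA[k] for k in QUOTA)

    print(fits(2, 4, 8192))  # True: 8 cores, 2 instances, 16 GiB
    print(fits(3, 8, 4096))  # False: 24 cores exceeds the 20-core limit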
[scheduler]
discover_hosts_in_cells_interval = -1
enable_isolated_aggregate_filtering = False
image_metadata_prefilter = False
limit_tenants_to_placement_aggregate = False
max_attempts = 3
max_placement_results = 1000
placement_aggregate_required_for_tenants = False
query_placement_for_availability_zone = False
query_placement_for_image_type_support = False
query_placement_for_routed_network_aggregates = False
workers = None
[filter_scheduler]
aggregate_image_properties_isolation_namespace = None
aggregate_image_properties_isolation_separator = .
available_filters = ['nova.scheduler.filters.all_filters']
build_failure_weight_multiplier = 1000000.0
cpu_weight_multiplier = 1.0
cross_cell_move_weight_multiplier = 1000000.0
disk_weight_multiplier = 1.0
enabled_filters = ['AvailabilityZoneFilter', 'ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter']
host_subset_size = 1
hypervisor_version_weight_multiplier = 1.0
image_properties_default_architecture = None
io_ops_weight_multiplier = -1.0
isolated_hosts = []
isolated_images = []
max_instances_per_host = 50
max_io_ops_per_host = 8
pci_weight_multiplier = 1.0
ram_weight_multiplier = 1.0
restrict_isolated_hosts_to_isolated_images = True
shuffle_best_same_weighed_hosts = False
soft_affinity_weight_multiplier = 1.0
soft_anti_affinity_weight_multiplier = 1.0
track_instance_changes = True
weight_classes = ['nova.scheduler.weights.all_weighers']

[metrics]
required = True
weight_multiplier = 1.0
weight_of_unavailable = -10000.0
weight_setting = []
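[filter_scheduler] and [metrics] together describe the two phases of host selection: the six enabled_filters are boolean predicates that prune the host list, then weighers score the survivors, each weigher's normalized score scaled by its *_weight_multiplier (RAM/CPU/disk at 1.0, io_ops at -1.0 so busier hosts score lower, build failures penalized at 1000000.0). A compact sketch of that filter-then-weigh shape, with invented host data (not Nova's scheduler code):

    hosts = [{"name": "h1", "free_ram": 4096, "io_ops": 2},
             {"name": "h2", "free_ram": 8192, "io_ops": 4}]

    filters = [lambda h: h["free_ram"] >= 2048,  # stand-in for ComputeFilter & co.
               lambda h: h["io_ops"] < 8]        # cf. max_io_ops_per_host = 8
    multipliers = {"ram": 1.0, "io_ops": -1.0}   # from the log above

    survivors = [h for h in hosts if all(f(h) for f in filters)]

    def score(h):
        return (multipliers["ram"] * h["free_ram"] / 8192
                + multipliers["io_ops"] * h["io_ops"] / 8)

    print(max(survivors, key=score)["name"])  # h2: free RAM outweighs its io load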
[serial_console]
base_url = ws://127.0.0.1:6083/
enabled = False
port_range = 10000:20000
proxyclient_address = 127.0.0.1
serialproxy_host = 0.0.0.0
serialproxy_port = 6083

[service_user]
auth_section = None
auth_type = password
cafile = None
certfile = None
collect_timing = False
insecure = False
keyfile = None
send_service_user_token = True
split_loggers = False
timeout = None

[spice]
agent_enabled = True
enabled = False
html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html
html5proxy_host = 0.0.0.0
html5proxy_port = 6082
server_listen = 127.0.0.1
server_proxyclient_address = 127.0.0.1

[upgrade_levels]
baseapi = None
cert = None
compute = None
conductor = None
scheduler = None

[vendordata_dynamic_auth]
auth_section = None
auth_type = None
cafile = None
certfile = None
collect_timing = False
insecure = False
keyfile = None
split_loggers = False
timeout = None
[vmware]
api_retry_count = 10
ca_file = None
cache_prefix = None
cluster_name = None
connection_pool_size = 10
console_delay_seconds = None
datastore_regex = None
host_ip = None
host_password = ****
host_port = 443
host_username = None
insecure = False
integration_bridge = None
maximum_objects = 100
pbm_default_policy = None
pbm_enabled = False
pbm_wsdl_location = None
serial_log_dir = /opt/vmware/vspc
serial_port_proxy_uri = None
serial_port_service_uri = None
task_poll_interval = 0.5
use_linked_clone = True
vnc_keymap = en-us
vnc_port = 5900
vnc_port_total = 10000

[vnc]
auth_schemes = ['none']
enabled = True
novncproxy_base_url = http://172.21.0.2:6080/vnc_auto.html
novncproxy_host = 0.0.0.0
novncproxy_port = 6080
server_listen = 172.17.0.100
server_proxyclient_address = 172.17.0.100
vencrypt_ca_certs = None
vencrypt_client_cert = None
vencrypt_client_key = None
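With vnc.enabled = True, the guest's VNC server listens on the 172.17.0.100 internal address while clients are handed novncproxy_base_url plus a one-time token, and the noVNC proxy bound on 0.0.0.0:6080 bridges the two. Roughly how such a console URL is composed (token handling simplified; real tokens come from Nova's console-auth machinery):

    # Sketch: compose a noVNC console URL from the logged options.
    from urllib.parse import urlencode

    base_url = "http://172.21.0.2:6080/vnc_auto.html"  # vnc.novncproxy_base_url
    token = "hypothetical-one-time-token"
    print(base_url + "?" + urlencode({"token": token}))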
[workarounds]
disable_compute_service_check_for_ffu = False
disable_deep_image_inspection = False
disable_fallback_pcpu_query = False
disable_group_policy_check_upcall = False
disable_libvirt_livesnapshot = False
disable_native_luksv1 = False
disable_rootwrap = False
enable_numa_live_migration = False
enable_qemu_monitor_announce_self = False
ensure_libvirt_rbd_instance_dir_cleanup = False
handle_virt_lifecycle_events = True
libvirt_disable_apic = False
never_download_image_if_on_rbd = False
rbd_volume_local_attach = False
reserve_disk_resource_for_image_cache = False
skip_cpu_compare_at_startup = False
skip_cpu_compare_on_dest = False
skip_hypervisor_version_check_on_lm = False
wait_for_vif_plugged_event_during_hard_reboot = []

[wsgi]
api_paste_config = api-paste.ini
client_socket_timeout = 900
default_pool_size = 1000
keep_alive = True
max_header_line = 16384
secure_proxy_ssl_header = None
ssl_ca_file = None
ssl_cert_file = None
ssl_key_file = None
tcp_keepidle = 600
wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f
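wsgi.wsgi_log_format is an old-style Python %-format string; each API request is logged by substituting a dict of request fields into it. With made-up values:

    fmt = ('%(client_ip)s "%(request_line)s" status: %(status_code)s'
           ' len: %(body_length)s time: %(wall_seconds).7f')
    print(fmt % {"client_ip": "172.21.0.5",
                 "request_line": "GET /v2.1/servers HTTP/1.1",
                 "status_code": 200, "body_length": 1821,
                 "wall_seconds": 0.0312})
    # 172.21.0.5 "GET /v2.1/servers HTTP/1.1" status: 200 len: 1821 time: 0.0312000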
[zvm]
ca_file = None
cloud_connector_url = None
image_tmp_path = /var/lib/nova/images
reachable_timeout = 300

[database]
backend = sqlalchemy
connection = ****
connection_debug = 0
connection_parameters =
connection_recycle_time = 3600
connection_trace = False
db_inc_retry_interval = True
db_max_retries = 20
db_max_retry_interval = 10
db_retry_interval = 1
max_overflow = 50
max_pool_size = 5
max_retries = -1
mysql_enable_ndb = False
mysql_sql_mode = TRADITIONAL
pool_timeout = None
retry_interval = 10
slave_connection = ****
sqlite_synchronous = True
use_db_reconnect = False
use_tpool = False

[oslo_concurrency]
disable_process_locking = False
lock_path = /var/lib/nova/tmp
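The [database] pool options above map more or less one-to-one onto SQLAlchemy engine arguments via oslo.db: five persistent connections, up to 50 overflow, each recycled after an hour, and max_retries = -1 meaning the initial connection is retried forever (every retry_interval = 10 s). A rough plain-SQLAlchemy equivalent, with a placeholder URL since database.connection is masked in the log:

    from sqlalchemy import create_engine

    engine = create_engine(
        "mysql+pymysql://nova:****@controller/nova",  # placeholder; real URL is masked
        pool_size=5,        # database.max_pool_size
        max_overflow=50,    # database.max_overflow
        pool_recycle=3600,  # database.connection_recycle_time
    )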
[oslo_policy]
enforce_new_defaults = False
enforce_scope = False
policy_default_rule = default
policy_dirs = ['policy.d']
policy_file = policy.yaml
remote_content_type = application/x-www-form-urlencoded
remote_ssl_ca_crt_file = None
remote_ssl_client_crt_file = None
remote_ssl_client_key_file = None
remote_ssl_verify_server_crt = False

[oslo_versionedobjects]
fatal_exception_format_errors = False

[remote_debug]
host = None
port = None

[oslo_messaging_rabbit]
amqp_auto_delete = False
amqp_durable_queues = False
conn_pool_min_size = 2
conn_pool_ttl = 1200
direct_mandatory_flag = True
enable_cancel_on_failover = False
heartbeat_in_pthread = False
heartbeat_rate = 2
heartbeat_timeout_threshold = 60
kombu_compression = None
kombu_failover_strategy = round-robin
kombu_missing_consumer_retry_timeout = 60
kombu_reconnect_delay = 1.0
rabbit_ha_queues = False
rabbit_interval_max = 30
rabbit_login_method = AMQPLAIN
rabbit_qos_prefetch_count = 0
rabbit_retry_backoff = 2
rabbit_retry_interval = 1
rabbit_transient_queues_ttl = 1800
rpc_conn_pool_size = 30
ssl = False
ssl_ca_file =
ssl_cert_file =
ssl_key_file =
ssl_version =
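Among the [oslo_messaging_rabbit] settings, the heartbeat pair is the one most often tuned: heartbeat_timeout_threshold = 60 is the negotiated AMQP heartbeat timeout, and heartbeat_rate = 2 means the client services heartbeats twice per timeout window; a connection silent past the threshold is torn down and redialed, with retries starting at rabbit_retry_interval = 1 s, backing off by rabbit_retry_backoff = 2 s per attempt, capped at rabbit_interval_max = 30 s. In numbers (a back-of-envelope sketch of the documented behavior, not oslo.messaging's code):

    threshold, rate = 60, 2            # heartbeat_timeout_threshold, heartbeat_rate
    print(threshold / rate)            # 30.0 s between client heartbeat passes
    interval, backoff, cap = 1, 2, 30  # rabbit_retry_*, rabbit_interval_max
    print([min(interval + n * backoff, cap) for n in range(5)])  # [1, 3, 5, 7, 9]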
[oslo_messaging_notifications]
driver = ['messagingv2']
retry = -1
topics = ['notifications']
transport_url = ****

[oslo_reports]
file_event_handler = None
file_event_handler_interval = 1
log_dir = None

[vif_plug_linux_bridge_privileged]
capabilities = [12]
group = None
helper_command = None
logger_name = oslo_privsep.daemon
thread_pool_size = 8
user = None
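[oslo_messaging_notifications] sends Nova's notifications over the same RabbitMQ transport (its transport_url is masked): the messagingv2 driver, the notifications topic, and retry = -1 for unlimited redelivery attempts. A minimal emit sketch with oslo.messaging; the publisher_id and payload are invented for illustration:

    from oslo_config import cfg
    import oslo_messaging

    conf = cfg.CONF
    transport = oslo_messaging.get_notification_transport(conf)
    notifier = oslo_messaging.Notifier(
        transport, publisher_id="compute.standalone.localdomain",
        driver="messagingv2", topics=["notifications"])
    notifier.info({}, "compute.instance.create.end", {"instance_id": "fake"})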
[vif_plug_ovs_privileged]
capabilities = [12]
group = None
helper_command = None
logger_name = oslo_privsep.daemon
thread_pool_size = 8
user = None

[os_vif_linux_bridge]
flat_interface = None
forward_bridge_interface = ['all']
iptables_bottom_regex =
iptables_drop_action = DROP
iptables_top_regex =
network_device_mtu = 1500
use_ipv6 = False
vlan_interface = None

[os_vif_ovs]
default_qos_type = linux-noop
isolate_vif = False
network_device_mtu = 1500
ovs_vsctl_timeout = 120
ovsdb_connection = tcp:127.0.0.1:6640
ovsdb_interface = native
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.084 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.084 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.084 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.084 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.084 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.084 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] privsep_osbrick.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.085 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.085 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] privsep_osbrick.logger_name = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.085 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.085 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.085 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.085 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.086 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.086 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.086 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.086 2 DEBUG oslo_service.service [req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 2025-10-06 15:59:46.086 2 DEBUG oslo_service.service 
[req-8599ff1d-e037-4e7d-8d55-01719ff3d4fd - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2617 2025-10-06 15:59:46.087 2 INFO nova.service [-] Starting compute node (version 23.2.3-17.1.20250522071028.2ace99d.el9ost) 2025-10-06 15:59:46.114 2 DEBUG nova.virt.libvirt.host [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:491 2025-10-06 15:59:46.115 2 DEBUG nova.virt.libvirt.host [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:497 2025-10-06 15:59:46.115 2 DEBUG nova.virt.libvirt.host [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:619 2025-10-06 15:59:46.115 2 DEBUG nova.virt.libvirt.host [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:502 2025-10-06 15:59:46.129 2 DEBUG nova.virt.libvirt.host [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Registering for lifecycle events _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:508 2025-10-06 15:59:46.142 2 DEBUG nova.virt.libvirt.host [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:529 2025-10-06 15:59:46.148 2 INFO nova.virt.libvirt.driver [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Connection event '1' reason 'None' 2025-10-06 15:59:46.172 2 WARNING nova.virt.libvirt.driver [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Cannot update service status on host "standalone.localdomain" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host standalone.localdomain could not be found. 
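The records above show the driver opening its libvirt connection and registering event callbacks; the capability dumps that follow come from the same connection. A minimal sketch of the equivalent probe with the libvirt Python bindings (illustrative, not Nova's code; the URI is the one logged):

    # Open the same qemu:///system URI the driver logs above and fetch the
    # capabilities XML that appears in the next records.
    # Requires the libvirt-python bindings and access to libvirtd.
    import libvirt

    conn = libvirt.openReadOnly('qemu:///system')  # read-only access is enough to probe
    try:
        caps_xml = conn.getCapabilities()  # the "Libvirt host capabilities" payload
        print(caps_xml[:200])
    finally:
        conn.close()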
2025-10-06 15:59:46.173 2 DEBUG nova.virt.libvirt.volume.mount [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
2025-10-06 15:59:46.994 2 INFO nova.virt.libvirt.host [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Libvirt host capabilities [capabilities XML flattened in this extract; recoverable details: host UUID 596793c4-40a4-4b01-8c11-5928f5df12e4, arch x86_64, CPU EPYC-Rome (AMD), migration transports tcp and rdma, 16116612 KiB memory, secmodels selinux and dac, hvm guests for i686 and x86_64 via /usr/libexec/qemu-kvm with machine types pc-i440fx-rhel7.6.0 (pc) and pc-q35-rhel7.6.0 through pc-q35-rhel9.2.0 (q35)]
2025-10-06 15:59:46.998 2 DEBUG nova.virt.libvirt.host [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:939
2025-10-06 15:59:47.026 2 DEBUG nova.virt.libvirt.host [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc: [domainCapabilities XML flattened; recoverable details: /usr/libexec/qemu-kvm, kvm, machine pc-i440fx-rhel7.6.0, firmware /usr/share/OVMF/OVMF_CODE.secboot.fd (rom/pflash), host CPU EPYC-Rome (AMD), named CPU models from 486 through SapphireRapids, disk buses ide/fdc/scsi/virtio/usb/sata, vnc and egl-headless graphics, virtiofs, rng backends random/egd/builtin, tpm-tis and tpm-crb, Hyper-V enlightenments relaxed/vapic/spinlocks/vpindex/runtime/synic/stimer/reset/vendor_id/frequencies/reenlightenment/tlbflush/ipi/avic] _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1024
2025-10-06 15:59:47.029 2 DEBUG nova.virt.libvirt.host [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35: [domainCapabilities XML flattened; as above but machine pc-q35-rhel9.2.0 and no ide disk bus] _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1024
2025-10-06 15:59:47.029 2 DEBUG nova.virt.libvirt.host [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc-q35-rhel9.0.0', 'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:939
2025-10-06 15:59:47.033 2 DEBUG nova.virt.libvirt.host [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc-q35-rhel9.0.0: [domainCapabilities XML flattened; recoverable details: /usr/libexec/qemu-kvm, kvm, machine pc-q35-rhel9.0.0, efi with edk2 firmware /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd, OVMF_CODE.fd, OVMF.amdsev.fd and OVMF.inteltdx.fd, secure boot supported, no ide disk bus, otherwise the same CPU model, disk, graphics and TPM lists as above] _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1024
2025-10-06 15:59:47.036 2 DEBUG nova.virt.libvirt.host [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc: [domainCapabilities XML flattened; machine pc-i440fx-rhel7.6.0 with /usr/share/OVMF/OVMF_CODE.secboot.fd, ide disk bus present, otherwise as above] _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1024
2025-10-06 15:59:47.039 2 DEBUG nova.virt.libvirt.host [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35: [domainCapabilities XML flattened; machine pc-q35-rhel9.2.0, efi with the same edk2 firmware set, secure boot supported, no ide disk bus] _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1024
2025-10-06 15:59:47.040 2 DEBUG nova.virt.libvirt.host [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1592
2025-10-06 15:59:47.040 2 INFO nova.virt.libvirt.host [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Secure Boot support detected
2025-10-06 15:59:47.041 2 INFO nova.virt.libvirt.driver [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
2025-10-06 15:59:47.041 2 INFO nova.virt.libvirt.driver [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
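For each (arch, machine type) pair logged above, the driver asks libvirt for a separate domain-capabilities document. A sketch of the equivalent query; the emulator path and machine type are copied from the records above, and the call is the standard libvirt-python binding rather than Nova's wrapper:

    import libvirt

    conn = libvirt.openReadOnly('qemu:///system')
    try:
        # Mirrors the arch=x86_64, machine_type=q35 probe in the log.
        xml = conn.getDomainCapabilities('/usr/libexec/qemu-kvm', 'x86_64', 'q35', 'kvm')
        print(xml[:200])  # <domainCapabilities> XML, summarized in the records above
    finally:
        conn.close()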
2025-10-06 15:59:47.080 2 INFO nova.virt.node [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Generated node identity 746a1a56-6a22-48dd-85a9-45922719c8f6
2025-10-06 15:59:47.081 2 INFO nova.virt.node [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Wrote node identity 746a1a56-6a22-48dd-85a9-45922719c8f6 to /var/lib/nova/compute_id
2025-10-06 15:59:47.098 2 WARNING nova.compute.manager [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Compute nodes ['746a1a56-6a22-48dd-85a9-45922719c8f6'] for host standalone.localdomain were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
2025-10-06 15:59:47.134 2 INFO nova.compute.manager [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
2025-10-06 15:59:47.166 2 WARNING nova.compute.manager [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] No compute node record found for host standalone.localdomain. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host standalone.localdomain could not be found.
2025-10-06 15:59:47.167 2 DEBUG oslo_concurrency.lockutils [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355
2025-10-06 15:59:47.167 2 DEBUG oslo_concurrency.lockutils [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367
2025-10-06 15:59:47.168 2 DEBUG nova.compute.resource_tracker [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858
2025-10-06 15:59:47.168 2 DEBUG oslo_concurrency.processutils [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
2025-10-06 15:59:47.662 2 DEBUG oslo_concurrency.processutils [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
2025-10-06 15:59:47.821 2 WARNING nova.virt.libvirt.driver [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
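The node-identity records show a freshly generated UUID being persisted to /var/lib/nova/compute_id so the node keeps the same identity across restarts. A rough sketch of that read-or-create pattern (illustrative, not Nova's exact implementation; the path is taken from the log):

    import os
    import uuid

    COMPUTE_ID_FILE = '/var/lib/nova/compute_id'  # path from the records above

    def get_local_node_uuid(path=COMPUTE_ID_FILE):
        """Return the persisted node identity, generating one on first start."""
        if os.path.exists(path):
            with open(path) as f:
                return f.read().strip()
        node_uuid = str(uuid.uuid4())   # "Generated node identity ..."
        with open(path, 'w') as f:      # "Wrote node identity ... to ..."
            f.write(node_uuid)
        return node_uuid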
2025-10-06 15:59:47.822 2 DEBUG nova.compute.resource_tracker [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=4631MB free_disk=6.99609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033
2025-10-06 15:59:47.823 2 DEBUG oslo_concurrency.lockutils [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355
2025-10-06 15:59:47.832 2 WARNING nova.compute.resource_tracker [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] No compute node record for standalone.localdomain:746a1a56-6a22-48dd-85a9-45922719c8f6: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 746a1a56-6a22-48dd-85a9-45922719c8f6 could not be found.
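The resource view embeds the host's PCI device list as a JSON array. A quick, hypothetical way to inspect such a dump offline, e.g. grouping functions by vendor (1af4 is virtio, 8086 is Intel):

    import json
    from collections import Counter

    # One entry pasted from the pci_devices=[...] array in the record above;
    # the real list has eleven such dicts.
    pci_json = """[
      {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0",
       "product_id": "1050", "vendor_id": "1af4", "numa_node": null,
       "label": "label_1af4_1050", "dev_type": "type-PCI"}
    ]"""

    devices = json.loads(pci_json)
    by_vendor = Counter(d["vendor_id"] for d in devices)
    # On the full list above this prints Counter({'1af4': 6, '8086': 5}):
    # six virtio functions (1af4) and five Intel chipset functions (8086).
    print(by_vendor)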
2025-10-06 15:59:47.850 2 INFO nova.compute.resource_tracker [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Compute node record created for standalone.localdomain:standalone.localdomain with uuid: 746a1a56-6a22-48dd-85a9-45922719c8f6
2025-10-06 15:59:47.891 2 DEBUG nova.compute.resource_tracker [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056
2025-10-06 15:59:47.891 2 DEBUG nova.compute.resource_tracker [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=512MB phys_disk=6GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065
2025-10-06 15:59:48.294 2 INFO nova.scheduler.client.report [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] [req-cb3a7561-d818-463b-95d2-8aa3562ba62d] Created resource provider record via placement API for resource provider with UUID 746a1a56-6a22-48dd-85a9-45922719c8f6 and name standalone.localdomain.
2025-10-06 15:59:48.296 2 DEBUG oslo_concurrency.processutils [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
2025-10-06 15:59:48.758 2 DEBUG oslo_concurrency.processutils [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
2025-10-06 15:59:48.765 2 DEBUG nova.virt.libvirt.host [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] /sys/module/kvm_amd/parameters/sev contains [N ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1613
2025-10-06 15:59:48.765 2 INFO nova.virt.libvirt.host [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] kernel doesn't support AMD SEV
2025-10-06 15:59:48.766 2 DEBUG nova.compute.provider_tree [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Updating inventory in ProviderTree for provider 746a1a56-6a22-48dd-85a9-45922719c8f6 with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 6, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:175
2025-10-06 15:59:48.767 2 DEBUG nova.virt.libvirt.driver [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5177
2025-10-06 15:59:48.823 2 DEBUG nova.scheduler.client.report [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Updated inventory for provider 746a1a56-6a22-48dd-85a9-45922719c8f6 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 6, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
2025-10-06 15:59:48.824 2 DEBUG nova.compute.provider_tree [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Updating resource provider 746a1a56-6a22-48dd-85a9-45922719c8f6 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:163
2025-10-06 15:59:48.824 2 DEBUG nova.compute.provider_tree [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Updating inventory in ProviderTree for provider 746a1a56-6a22-48dd-85a9-45922719c8f6 with inventory: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:175
2025-10-06 15:59:48.942 2 DEBUG nova.compute.provider_tree [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Updating resource provider 746a1a56-6a22-48dd-85a9-45922719c8f6 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:163
2025-10-06 15:59:48.942 2 DEBUG nova.compute.resource_tracker [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994
2025-10-06 15:59:48.943 2 DEBUG oslo_concurrency.lockutils [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367
2025-10-06 15:59:48.943 2 DEBUG nova.service [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
2025-10-06 15:59:48.991 2 DEBUG nova.service [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
2025-10-06 15:59:48.991 2 DEBUG nova.servicegroup.drivers.db [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] DB_Driver: join new ServiceGroup member standalone.localdomain to the compute group, service = join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
2025-10-06 16:00:07.994 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:00:08.026 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:00:45.354 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:00:45.398 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:00:45.399 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514
2025-10-06 16:00:45.399 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518
2025-10-06 16:00:45.408 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9600
2025-10-06 16:00:45.408 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:00:45.408 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:00:45.409 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:00:45.409 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:00:45.426 2 INFO nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running instance usage audit for host standalone.localdomain from 2025-10-06 15:00:00 to 2025-10-06 16:00:00. 0 instances.
2025-10-06 16:00:45.449 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:00:45.450 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:00:45.450 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131
2025-10-06 16:00:45.450 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:00:45.463 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355
2025-10-06 16:00:45.463 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367
2025-10-06 16:00:45.463 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858
2025-10-06 16:00:45.467 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
2025-10-06 16:00:45.938 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
2025-10-06 16:00:46.176 2 WARNING nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
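The Running periodic task records above come from oslo.service's periodic-task machinery, which invokes decorated manager methods on a timer. A minimal sketch of how such tasks are declared (illustrative; the spacing values are made up, and the print bodies only mirror the log messages):

    from oslo_config import cfg
    from oslo_service import periodic_task

    class Manager(periodic_task.PeriodicTasks):
        """Toy stand-in for ComputeManager; method names mirror the log."""

        @periodic_task.periodic_task(spacing=60)  # interval is hypothetical
        def _heal_instance_info_cache(self, context):
            print('Starting heal instance info cache')

        @periodic_task.periodic_task(spacing=60)
        def update_available_resource(self, context):
            print('Auditing locally available compute resources')

    mgr = Manager(cfg.CONF)
    mgr.run_periodic_tasks(context=None)  # the service loop calls this repeatedly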
2025-10-06 16:00:46.178 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=4373MB free_disk=6.99609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033
2025-10-06 16:00:46.178 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355
2025-10-06 16:00:46.229 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056
2025-10-06 16:00:46.229 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=512MB phys_disk=6GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065
2025-10-06 16:00:46.257 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
2025-10-06 16:00:46.759 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
2025-10-06 16:00:46.790 2 DEBUG nova.compute.provider_tree [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Inventory has not changed in ProviderTree for provider: 746a1a56-6a22-48dd-85a9-45922719c8f6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179
2025-10-06 16:00:46.814 2 DEBUG nova.scheduler.client.report [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Inventory has not changed for provider 746a1a56-6a22-48dd-85a9-45922719c8f6 based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
2025-10-06 16:00:46.815 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994
2025-10-06 16:00:46.815 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.637s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367
2025-10-06 16:01:46.738 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:01:46.739 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:01:46.766 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:01:46.766 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:01:46.767 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:01:46.806 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355
2025-10-06 16:01:46.806 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367
2025-10-06 16:01:46.806 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858
2025-10-06 16:01:46.806 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
2025-10-06 16:01:47.297 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
2025-10-06 16:01:47.723 2 WARNING nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
2025-10-06 16:01:47.725 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=4771MB free_disk=6.99609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033
2025-10-06 16:01:47.726 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355
2025-10-06 16:01:47.809 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056
2025-10-06 16:01:47.810 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=512MB phys_disk=6GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065
2025-10-06 16:01:47.813 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
2025-10-06 16:01:48.294 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
2025-10-06 16:01:48.301 2 DEBUG nova.compute.provider_tree [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Inventory has not changed in ProviderTree for provider: 746a1a56-6a22-48dd-85a9-45922719c8f6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179
2025-10-06 16:01:48.318 2 DEBUG nova.scheduler.client.report [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Inventory has not changed for provider 746a1a56-6a22-48dd-85a9-45922719c8f6 based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
2025-10-06 16:01:48.319 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994
2025-10-06 16:01:48.320 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.594s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367
2025-10-06 16:01:48.953 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:01:48.953 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514
2025-10-06 16:01:48.953 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518
2025-10-06 16:01:48.970 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9600
2025-10-06 16:01:48.970 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:01:48.974 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:01:48.974 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:01:48.995 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:01:48.996 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131
2025-10-06 16:02:45.363 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:02:46.400 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:02:46.401 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:02:47.398 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:02:47.424 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355
2025-10-06 16:02:47.424 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367
2025-10-06 16:02:47.424 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858
2025-10-06 16:02:47.425 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
2025-10-06 16:02:47.900 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
2025-10-06 16:02:48.047 2 WARNING nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
2025-10-06 16:02:48.048 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=4540MB free_disk=6.99609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033
2025-10-06 16:02:48.049 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355
2025-10-06 16:02:48.085 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056
2025-10-06 16:02:48.086 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=512MB phys_disk=6GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065
2025-10-06 16:02:48.104 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
2025-10-06 16:02:48.517 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.413s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
2025-10-06 16:02:48.527 2 DEBUG nova.compute.provider_tree [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Inventory has not changed in ProviderTree for provider: 746a1a56-6a22-48dd-85a9-45922719c8f6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179
2025-10-06 16:02:48.541 2 DEBUG nova.scheduler.client.report [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Inventory has not changed for provider 746a1a56-6a22-48dd-85a9-45922719c8f6 based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
2025-10-06 16:02:48.541 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994
2025-10-06 16:02:48.541 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.493s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367
2025-10-06 16:02:49.543 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:02:49.544 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514
2025-10-06 16:02:49.544 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518
2025-10-06 16:02:49.565 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9600
2025-10-06 16:02:49.566 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:02:49.566 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:02:49.566 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:02:49.582 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:02:49.583 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131
2025-10-06 16:03:45.359 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:03:46.321 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:03:46.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:03:48.398 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:03:48.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:03:48.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:03:48.430 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355
2025-10-06 16:03:48.431 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367
2025-10-06 16:03:48.431 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858
2025-10-06 16:03:48.431 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
2025-10-06 16:03:49.002 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.571s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
2025-10-06 16:03:49.249 2 WARNING nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
2025-10-06 16:03:49.250 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=4436MB free_disk=6.99609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033
2025-10-06 16:03:49.250 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock
"compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:03:49.341 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056 2025-10-06 16:03:49.342 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=512MB phys_disk=6GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065 2025-10-06 16:03:49.345 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:03:49.836 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:03:49.844 2 DEBUG nova.compute.provider_tree [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Inventory has not changed in ProviderTree for provider: 746a1a56-6a22-48dd-85a9-45922719c8f6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179 2025-10-06 16:03:49.858 2 DEBUG nova.scheduler.client.report [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Inventory has not changed for provider 746a1a56-6a22-48dd-85a9-45922719c8f6 based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 2025-10-06 16:03:49.858 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994 2025-10-06 16:03:49.859 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.608s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:03:51.861 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:03:51.862 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-06 16:03:51.862 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Rebuilding the list 
of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518 2025-10-06 16:03:51.872 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9600 2025-10-06 16:03:51.872 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:03:51.872 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:03:51.878 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:03:51.878 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-06 16:04:45.337 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:04:45.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:04:45.399 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10780 2025-10-06 16:04:45.414 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10789 2025-10-06 16:04:45.415 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:04:45.415 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:10818 2025-10-06 16:04:45.426 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:04:47.434 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:04:50.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task 
ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:04:50.400 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:04:50.400 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:04:50.422 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:04:50.422 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:04:50.423 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858 2025-10-06 16:04:50.423 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:04:50.859 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:04:51.008 2 WARNING nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
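The audit loop above repeats roughly once a minute: before re-reporting the hypervisor view, nova-compute shells out to `ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf` to size its RBD-backed disk inventory. A minimal standalone sketch of that same probe (not Nova's actual code path; the `stats.total_bytes` / `stats.total_avail_bytes` JSON keys are assumed from standard `ceph df` output, since the log shows the command but not its result):

    import json
    import subprocess

    def ceph_capacity_gib(conf="/etc/ceph/ceph.conf", user="openstack"):
        # Same command line the resource tracker logs above.
        out = subprocess.check_output(
            ["ceph", "df", "--format=json", "--id", user, "--conf", conf])
        # A top-level "stats" object with total/avail byte counters is
        # assumed here, per standard `ceph df` JSON; not shown in this log.
        stats = json.loads(out)["stats"]
        gib = 1024 ** 3
        return stats["total_bytes"] / gib, stats["total_avail_bytes"] / gib

Each "returned: 0 in 0.4-0.6s" entry above is one round trip of this probe, and the free_disk=6.99609375GB figure in the resource view is presumably derived from the same cluster statistics.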
2025-10-06 16:04:51.010 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=4771MB free_disk=6.99609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033 2025-10-06 16:04:51.010 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:04:51.105 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056 2025-10-06 16:04:51.105 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=512MB phys_disk=6GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065 2025-10-06 16:04:51.130 2 DEBUG nova.scheduler.client.report [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Refreshing inventories for resource provider 746a1a56-6a22-48dd-85a9-45922719c8f6 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804 2025-10-06 16:04:51.151 2 DEBUG nova.scheduler.client.report 
[req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Updating ProviderTree inventory for provider 746a1a56-6a22-48dd-85a9-45922719c8f6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768 2025-10-06 16:04:51.151 2 DEBUG nova.compute.provider_tree [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Updating inventory in ProviderTree for provider 746a1a56-6a22-48dd-85a9-45922719c8f6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:175 2025-10-06 16:04:51.171 2 DEBUG nova.scheduler.client.report [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Refreshing aggregate associations for resource provider 746a1a56-6a22-48dd-85a9-45922719c8f6, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813 2025-10-06 16:04:51.196 2 DEBUG nova.scheduler.client.report [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Refreshing trait associations for resource provider 746a1a56-6a22-48dd-85a9-45922719c8f6, traits: COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_FMA3,HW_CPU_X86_BMI,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,HW_CPU_X86_F16C,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_BMI2,HW_CPU_X86_SSSE3,COMPUTE_NODE,HW_CPU_X86_AVX2,HW_CPU_X86_SSE,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SVM,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_ABM,HW_CPU_X86_SHA,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825 2025-10-06 16:04:51.197 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:04:51.686 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df 
--format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:04:51.694 2 DEBUG nova.compute.provider_tree [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Inventory has not changed in ProviderTree for provider: 746a1a56-6a22-48dd-85a9-45922719c8f6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179 2025-10-06 16:04:51.708 2 DEBUG nova.scheduler.client.report [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Inventory has not changed for provider 746a1a56-6a22-48dd-85a9-45922719c8f6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 2025-10-06 16:04:51.709 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994 2025-10-06 16:04:51.710 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.699s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:04:52.718 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:04:52.719 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-06 16:04:52.719 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518 2025-10-06 16:04:52.736 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Didn't find any instances for network info cache update. 
_heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9600 2025-10-06 16:04:52.736 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:04:52.737 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:04:53.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:04:53.399 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-06 16:05:46.322 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:05:48.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:05:49.321 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:05:50.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:05:51.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:05:51.400 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:05:51.415 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:05:51.415 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:05:51.416 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource 
/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858 2025-10-06 16:05:51.416 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:05:51.901 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:05:52.138 2 WARNING nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 2025-10-06 16:05:52.140 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=4691MB free_disk=6.99609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033 2025-10-06 16:05:52.141 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:05:52.206 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Total usable 
vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056 2025-10-06 16:05:52.207 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=512MB phys_disk=6GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065 2025-10-06 16:05:52.209 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:05:52.666 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:05:52.672 2 DEBUG nova.compute.provider_tree [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Inventory has not changed in ProviderTree for provider: 746a1a56-6a22-48dd-85a9-45922719c8f6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179 2025-10-06 16:05:52.697 2 DEBUG nova.scheduler.client.report [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Inventory has not changed for provider 746a1a56-6a22-48dd-85a9-45922719c8f6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 2025-10-06 16:05:52.697 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994 2025-10-06 16:05:52.698 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.557s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:05:53.697 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:05:53.698 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:05:54.416 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:05:54.416 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Starting heal instance info cache 
_heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-06 16:05:54.416 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518 2025-10-06 16:05:54.433 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9600 2025-10-06 16:05:55.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:05:55.400 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-06 16:06:48.322 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:06:49.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:06:51.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:06:51.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:06:51.400 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:06:51.424 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:06:51.425 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:06:51.425 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858 2025-10-06 16:06:51.426 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:06:51.903 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:06:52.128 2 WARNING nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 2025-10-06 16:06:52.130 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=4724MB free_disk=6.99609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033 2025-10-06 16:06:52.130 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:06:52.188 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056 2025-10-06 16:06:52.188 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Final resource view: 
name=standalone.localdomain phys_ram=15738MB used_ram=512MB phys_disk=6GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065 2025-10-06 16:06:52.191 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:06:52.691 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:06:52.697 2 DEBUG nova.compute.provider_tree [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Inventory has not changed in ProviderTree for provider: 746a1a56-6a22-48dd-85a9-45922719c8f6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179 2025-10-06 16:06:52.712 2 DEBUG nova.scheduler.client.report [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Inventory has not changed for provider 746a1a56-6a22-48dd-85a9-45922719c8f6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 2025-10-06 16:06:52.713 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994 2025-10-06 16:06:52.713 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.583s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:06:53.713 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:06:53.714 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:06:55.412 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:06:55.413 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-06 16:06:55.413 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9518 2025-10-06 16:06:55.429 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9600 2025-10-06 16:06:57.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:06:57.399 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-06 16:07:49.400 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:07:50.321 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:07:52.325 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:07:52.401 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:07:52.401 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:07:52.412 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:07:52.440 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:07:52.441 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:07:52.441 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858 2025-10-06 16:07:52.441 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:07:52.944 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:07:53.164 2 WARNING nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 2025-10-06 16:07:53.165 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=4438MB free_disk=6.99609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033 2025-10-06 16:07:53.166 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:07:53.212 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056 2025-10-06 16:07:53.213 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Final resource view: 
name=standalone.localdomain phys_ram=15738MB used_ram=512MB phys_disk=6GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065 2025-10-06 16:07:53.214 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:07:53.734 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.520s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:07:53.741 2 DEBUG nova.compute.provider_tree [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Inventory has not changed in ProviderTree for provider: 746a1a56-6a22-48dd-85a9-45922719c8f6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179 2025-10-06 16:07:53.754 2 DEBUG nova.scheduler.client.report [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Inventory has not changed for provider 746a1a56-6a22-48dd-85a9-45922719c8f6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 2025-10-06 16:07:53.755 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994 2025-10-06 16:07:53.755 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.590s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:07:54.744 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:07:54.744 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:07:56.401 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:07:56.401 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-06 16:07:56.402 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9518 2025-10-06 16:07:56.424 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9600 2025-10-06 16:07:57.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:07:57.400 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-06 16:08:06.969 2 DEBUG oslo_concurrency.lockutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lock "4fd14755-66e2-4403-a5fb-0f03557d253f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:08:06.983 2 DEBUG nova.compute.manager [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2306 2025-10-06 16:08:07.084 2 DEBUG oslo_concurrency.lockutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:08:07.087 2 DEBUG nova.virt.hardware [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Require both a host and instance NUMA topology to fit instance on host.
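[Editor's note: the recurring Lock "…" acquired by … :: waited / released by … :: held pairs throughout this log come from oslo.concurrency's named in-process locks. A minimal sketch of the same pattern, for orientation only — this is not Nova's code, and the function name here is made up:]

```python
# Illustrative sketch of the oslo.concurrency named-lock pattern that
# produces the 'acquired by ... :: waited' / 'released by ... :: held'
# lockutils.py:355/367 lines seen throughout this log.
from oslo_concurrency import lockutils

@lockutils.synchronized('compute_resources')
def update_available_resource():
    # The body runs under the in-process lock named "compute_resources";
    # lockutils logs the wait time on acquisition and hold time on release.
    pass
```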
numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2200 2025-10-06 16:08:07.087 2 INFO nova.compute.claims [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Claim successful on node standalone.localdomain 2025-10-06 16:08:07.220 2 DEBUG oslo_concurrency.processutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:08:07.660 2 DEBUG oslo_concurrency.processutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:08:07.666 2 DEBUG nova.compute.provider_tree [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Inventory has not changed in ProviderTree for provider: 746a1a56-6a22-48dd-85a9-45922719c8f6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179 2025-10-06 16:08:07.676 2 DEBUG nova.scheduler.client.report [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Inventory has not changed for provider 746a1a56-6a22-48dd-85a9-45922719c8f6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 2025-10-06 16:08:07.676 2 DEBUG oslo_concurrency.lockutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.592s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:08:07.676 2 DEBUG nova.compute.manager [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2679 2025-10-06 16:08:07.726 2 DEBUG nova.compute.manager [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Allocating IP information in the background. 
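[Editor's note: the repeated ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf subprocess above is how the RBD image backend samples pool capacity. A hedged sketch of reproducing that probe with oslo.concurrency — the 'vms' pool name comes from the rbd imports later in this log, but the JSON layout ('pools'/'stats' keys) is an assumption based on current Ceph output, not taken from this log:]

```python
# Sketch (not Nova's implementation): reproduce the logged 'ceph df' probe
# and pull usage for the 'vms' pool that the rbd imports below target.
import json
from oslo_concurrency import processutils

def ceph_pool_stats(pool='vms'):
    # Mirrors the logged command line exactly.
    out, _err = processutils.execute(
        'ceph', 'df', '--format=json',
        '--id', 'openstack', '--conf', '/etc/ceph/ceph.conf')
    df = json.loads(out)
    # Assumed shape: a 'pools' list whose entries carry a 'stats' dict.
    return next(p['stats'] for p in df['pools'] if p['name'] == pool)
```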
_allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1845 2025-10-06 16:08:07.726 2 DEBUG nova.network.neutron [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1020 2025-10-06 16:08:07.739 2 DEBUG nova.block_device [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] block_device_list [] volume_in_mapping /usr/lib/python3.9/site-packages/nova/block_device.py:617 2025-10-06 16:08:07.753 2 DEBUG nova.compute.manager [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2714 2025-10-06 16:08:07.842 2 DEBUG nova.compute.manager [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2510 2025-10-06 16:08:07.842 2 DEBUG nova.block_device [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] block_device_list ['vdb'] volume_in_mapping /usr/lib/python3.9/site-packages/nova/block_device.py:617 2025-10-06 16:08:07.843 2 DEBUG nova.virt.libvirt.driver [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4592 2025-10-06 16:08:07.844 2 INFO nova.virt.libvirt.driver [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Creating image 2025-10-06 16:08:07.894 2 DEBUG nova.storage.rbd_utils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] rbd image 4fd14755-66e2-4403-a5fb-0f03557d253f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:79 2025-10-06 16:08:07.916 2 DEBUG nova.storage.rbd_utils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] rbd image 4fd14755-66e2-4403-a5fb-0f03557d253f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:79 2025-10-06 16:08:07.935 2 DEBUG nova.storage.rbd_utils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] rbd image 4fd14755-66e2-4403-a5fb-0f03557d253f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:79 2025-10-06 16:08:07.939 2 DEBUG oslo_concurrency.lockutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lock "ca66e6c0826faa657bf7e913778ff2c76a4be080" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync"
:: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:08:08.207 2 DEBUG nova.virt.libvirt.imagebackend [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Image locations are: [{'url': 'rbd://875a4e5c-1cd1-50e8-b535-172a78890e57/images/ff3d6c10-010e-401c-9389-48fe10b80f3e/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://875a4e5c-1cd1-50e8-b535-172a78890e57/images/ff3d6c10-010e-401c-9389-48fe10b80f3e/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1046 2025-10-06 16:08:08.288 2 WARNING oslo_policy.policy [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html. 2025-10-06 16:08:08.288 2 WARNING oslo_policy.policy [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html. 2025-10-06 16:08:08.570 2 DEBUG oslo_concurrency.processutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ca66e6c0826faa657bf7e913778ff2c76a4be080.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:08:08.654 2 DEBUG oslo_concurrency.processutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ca66e6c0826faa657bf7e913778ff2c76a4be080.part --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:08:08.655 2 DEBUG nova.virt.images [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] ff3d6c10-010e-401c-9389-48fe10b80f3e was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242 2025-10-06 16:08:08.656 2 DEBUG nova.privsep.utils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63 2025-10-06 16:08:08.656 2 DEBUG oslo_concurrency.processutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default 
default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/ca66e6c0826faa657bf7e913778ff2c76a4be080.part /var/lib/nova/instances/_base/ca66e6c0826faa657bf7e913778ff2c76a4be080.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:08:08.973 2 DEBUG nova.network.neutron [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Successfully created port: e6ead84d-622e-4356-93d8-2f0ae5455764 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:549 2025-10-06 16:08:09.050 2 DEBUG oslo_concurrency.processutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/ca66e6c0826faa657bf7e913778ff2c76a4be080.part /var/lib/nova/instances/_base/ca66e6c0826faa657bf7e913778ff2c76a4be080.converted" returned: 0 in 0.394s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:08:09.052 2 DEBUG oslo_concurrency.processutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ca66e6c0826faa657bf7e913778ff2c76a4be080.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:08:09.112 2 DEBUG oslo_concurrency.processutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ca66e6c0826faa657bf7e913778ff2c76a4be080.converted --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:08:09.113 2 DEBUG oslo_concurrency.lockutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lock "ca66e6c0826faa657bf7e913778ff2c76a4be080" released by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:08:09.150 2 DEBUG nova.storage.rbd_utils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] rbd image 4fd14755-66e2-4403-a5fb-0f03557d253f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:79 2025-10-06 16:08:09.154 2 DEBUG oslo_concurrency.processutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/ca66e6c0826faa657bf7e913778ff2c76a4be080 4fd14755-66e2-4403-a5fb-0f03557d253f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:08:10.069 2 DEBUG nova.network.neutron [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff
64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Successfully updated port: e6ead84d-622e-4356-93d8-2f0ae5455764 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:587 2025-10-06 16:08:10.087 2 DEBUG oslo_concurrency.lockutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Acquired lock "refresh_cache-4fd14755-66e2-4403-a5fb-0f03557d253f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:266 2025-10-06 16:08:10.087 2 DEBUG nova.network.neutron [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:1872 2025-10-06 16:08:10.164 2 DEBUG nova.network.neutron [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3049 2025-10-06 16:08:10.460 2 DEBUG nova.compute.manager [req-7d002335-30de-4646-af65-431809536b7b e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Received event network-changed-e6ead84d-622e-4356-93d8-2f0ae5455764 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10683 2025-10-06 16:08:10.461 2 DEBUG nova.compute.manager [req-7d002335-30de-4646-af65-431809536b7b e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Refreshing instance network info cache due to event network-changed-e6ead84d-622e-4356-93d8-2f0ae5455764. 
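[Editor's note: at this point port e6ead84d-622e-4356-93d8-2f0ae5455764 has been created and updated but the cached VIF still shows "active": false. A hedged openstacksdk sketch of watching it flip to ACTIVE from outside the compute host — the cloud name is an assumption, not from this log:]

```python
# Illustration only: poll the Neutron port from this log until its status
# is ACTIVE, using openstacksdk. 'standalone' is an assumed clouds.yaml entry.
import time
import openstack

conn = openstack.connect(cloud='standalone')

def wait_port_active(port_id, timeout=60):
    deadline = time.time() + timeout
    while time.time() < deadline:
        port = conn.network.get_port(port_id)
        if port.status == 'ACTIVE':
            return port
        time.sleep(2)
    raise TimeoutError(f'port {port_id} did not become ACTIVE in {timeout}s')

wait_port_active('e6ead84d-622e-4356-93d8-2f0ae5455764')
```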
external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10688 2025-10-06 16:08:10.483 2 DEBUG nova.network.neutron [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Updating instance_info_cache with network_info: [{"id": "e6ead84d-622e-4356-93d8-2f0ae5455764", "address": "fa:16:3e:ec:db:8e", "network": {"id": "551ecee8-9a26-40b8-b2fb-1be7f7ae2d07", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "64e40da08e4a44b0890ef4f0195eaaa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tape6ead84d-62", "ovs_interfaceid": "e6ead84d-622e-4356-93d8-2f0ae5455764", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116 2025-10-06 16:08:10.518 2 DEBUG oslo_concurrency.lockutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Releasing lock "refresh_cache-4fd14755-66e2-4403-a5fb-0f03557d253f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:282 2025-10-06 16:08:10.519 2 DEBUG nova.compute.manager [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Instance network_info: |[{"id": "e6ead84d-622e-4356-93d8-2f0ae5455764", "address": "fa:16:3e:ec:db:8e", "network": {"id": "551ecee8-9a26-40b8-b2fb-1be7f7ae2d07", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "64e40da08e4a44b0890ef4f0195eaaa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tape6ead84d-62", "ovs_interfaceid": "e6ead84d-622e-4356-93d8-2f0ae5455764", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1859 2025-10-06 16:08:10.519 2 DEBUG oslo_concurrency.lockutils [req-7d002335-30de-4646-af65-431809536b7b e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] Acquired lock "refresh_cache-4fd14755-66e2-4403-a5fb-0f03557d253f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:266 2025-10-06 16:08:10.520 2 DEBUG nova.network.neutron [req-7d002335-30de-4646-af65-431809536b7b e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - 
default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Refreshing network info cache for port e6ead84d-622e-4356-93d8-2f0ae5455764 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:1869 2025-10-06 16:08:10.994 2 DEBUG nova.network.neutron [req-7d002335-30de-4646-af65-431809536b7b e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Updated VIF entry in instance network info cache for port e6ead84d-622e-4356-93d8-2f0ae5455764. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3183 2025-10-06 16:08:10.994 2 DEBUG nova.network.neutron [req-7d002335-30de-4646-af65-431809536b7b e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Updating instance_info_cache with network_info: [{"id": "e6ead84d-622e-4356-93d8-2f0ae5455764", "address": "fa:16:3e:ec:db:8e", "network": {"id": "551ecee8-9a26-40b8-b2fb-1be7f7ae2d07", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "64e40da08e4a44b0890ef4f0195eaaa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tape6ead84d-62", "ovs_interfaceid": "e6ead84d-622e-4356-93d8-2f0ae5455764", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116 2025-10-06 16:08:11.006 2 DEBUG oslo_concurrency.lockutils [req-7d002335-30de-4646-af65-431809536b7b e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] Releasing lock "refresh_cache-4fd14755-66e2-4403-a5fb-0f03557d253f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:282 2025-10-06 16:08:11.051 2 DEBUG oslo_concurrency.processutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/ca66e6c0826faa657bf7e913778ff2c76a4be080 4fd14755-66e2-4403-a5fb-0f03557d253f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.896s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:08:11.122 2 DEBUG nova.storage.rbd_utils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] resizing rbd image 4fd14755-66e2-4403-a5fb-0f03557d253f_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:286 2025-10-06 16:08:11.217 2 DEBUG nova.objects.instance [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lazy-loading 'migration_context' on Instance uuid 4fd14755-66e2-4403-a5fb-0f03557d253f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1098 2025-10-06 16:08:11.250 2 DEBUG nova.storage.rbd_utils 
[req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] rbd image 4fd14755-66e2-4403-a5fb-0f03557d253f_disk.eph0 does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:79 2025-10-06 16:08:11.273 2 DEBUG nova.storage.rbd_utils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] rbd image 4fd14755-66e2-4403-a5fb-0f03557d253f_disk.eph0 does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:79 2025-10-06 16:08:11.277 2 DEBUG oslo_concurrency.lockutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lock "ephemeral_1_0706d66" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:08:11.277 2 DEBUG oslo_concurrency.processutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/_base/ephemeral_1_0706d66 1G execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:08:11.297 2 DEBUG oslo_concurrency.processutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/_base/ephemeral_1_0706d66 1G" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:08:11.298 2 DEBUG oslo_concurrency.processutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Running cmd (subprocess): mkfs -t vfat -n ephemeral0 /var/lib/nova/instances/_base/ephemeral_1_0706d66 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:08:11.331 2 DEBUG oslo_concurrency.processutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] CMD "mkfs -t vfat -n ephemeral0 /var/lib/nova/instances/_base/ephemeral_1_0706d66" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:08:11.332 2 DEBUG oslo_concurrency.lockutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lock "ephemeral_1_0706d66" released by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.056s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:08:11.353 2 DEBUG nova.storage.rbd_utils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] rbd image 4fd14755-66e2-4403-a5fb-0f03557d253f_disk.eph0 does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:79 2025-10-06 16:08:11.356 2 DEBUG oslo_concurrency.processutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/ephemeral_1_0706d66 4fd14755-66e2-4403-a5fb-0f03557d253f_disk.eph0 --image-format=2 --id
openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:08:12.151 2 DEBUG oslo_concurrency.processutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/ephemeral_1_0706d66 4fd14755-66e2-4403-a5fb-0f03557d253f_disk.eph0 --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.795s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:08:12.257 2 DEBUG nova.virt.libvirt.driver [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719 2025-10-06 16:08:12.258 2 DEBUG nova.virt.libvirt.driver [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Ensure instance console log exists: /var/lib/nova/instances/4fd14755-66e2-4403-a5fb-0f03557d253f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4482 2025-10-06 16:08:12.260 2 DEBUG oslo_concurrency.lockutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:08:12.260 2 DEBUG oslo_concurrency.lockutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lock "vgpu_resources" released by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:08:12.262 2 DEBUG nova.virt.libvirt.driver [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Start _get_guest_xml network_info=[{"id": "e6ead84d-622e-4356-93d8-2f0ae5455764", "address": "fa:16:3e:ec:db:8e", "network": {"id": "551ecee8-9a26-40b8-b2fb-1be7f7ae2d07", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "64e40da08e4a44b0890ef4f0195eaaa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tape6ead84d-62", "ovs_interfaceid": "e6ead84d-622e-4356-93d8-2f0ae5455764", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'type': 'disk', 'dev': 'vda', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'type': 'disk', 'dev': 'vda', 'boot_index': '1'}, 
'disk.eph0': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} image_meta=ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2025-10-06T16:07:06Z,direct_url=<?>,disk_format='qcow2',id=ff3d6c10-010e-401c-9389-48fe10b80f3e,min_disk=0,min_ram=0,name='cirros',owner='64e40da08e4a44b0890ef4f0195eaaa4',properties=ImageMetaProps,protected=<?>,size=21692416,status='active',tags=<?>,updated_at=2025-10-06T16:07:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'ephemerals': [{'device_name': '/dev/vdb', 'guest_format': None, 'disk_bus': 'virtio', 'size': 1, 'device_type': 'disk'}], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7177 2025-10-06 16:08:12.267 2 WARNING nova.virt.libvirt.driver [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 2025-10-06 16:08:12.270 2 DEBUG nova.virt.libvirt.host [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Searching host: 'standalone.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1463 2025-10-06 16:08:12.271 2 DEBUG nova.virt.libvirt.host [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1473 2025-10-06 16:08:12.273 2 DEBUG nova.virt.libvirt.host [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Searching host: 'standalone.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1482 2025-10-06 16:08:12.273 2 DEBUG nova.virt.libvirt.host [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] CPU controller found on host.
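[Editor's note: the cgroups probe above (V1 controller missing, V2 controller found) comes down to checking where the cpu controller is registered. A minimal sketch of the V2 half — the general mechanism, not a copy of nova.virt.libvirt.host:]

```python
# Sketch: detect a cgroups-v2 cpu controller the way the log describes.
from pathlib import Path

def has_cgroupsv2_cpu_controller(root='/sys/fs/cgroup'):
    # On a unified (v2) hierarchy the enabled controllers are listed,
    # space-separated, in one file at the hierarchy root.
    ctl = Path(root) / 'cgroup.controllers'
    return ctl.exists() and 'cpu' in ctl.read_text().split()
```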
_has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1489 2025-10-06 16:08:12.274 2 DEBUG nova.virt.libvirt.driver [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5177 2025-10-06 16:08:12.275 2 DEBUG nova.virt.hardware [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-06T16:07:13Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=1,extra_specs={},flavorid='db493b7d-b93a-4c5e-8afe-780886438f8d',id=3,is_public=True,memory_mb=512,name='m1.small',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='87617e24a5e30cb3b87fda8c0764838f',container_format='bare',created_at=2025-10-06T16:07:06Z,direct_url=<?>,disk_format='qcow2',id=ff3d6c10-010e-401c-9389-48fe10b80f3e,min_disk=0,min_ram=0,name='cirros',owner='64e40da08e4a44b0890ef4f0195eaaa4',properties=ImageMetaProps,protected=<?>,size=21692416,status='active',tags=<?>,updated_at=2025-10-06T16:07:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:558 2025-10-06 16:08:12.276 2 DEBUG nova.virt.hardware [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:343 2025-10-06 16:08:12.276 2 DEBUG nova.virt.hardware [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:347 2025-10-06 16:08:12.276 2 DEBUG nova.virt.hardware [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:383 2025-10-06 16:08:12.277 2 DEBUG nova.virt.hardware [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:387 2025-10-06 16:08:12.277 2 DEBUG nova.virt.hardware [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:425 2025-10-06 16:08:12.277 2 DEBUG nova.virt.hardware [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:564 2025-10-06 16:08:12.278 2 DEBUG nova.virt.hardware [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default]
Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:466 2025-10-06 16:08:12.278 2 DEBUG nova.virt.hardware [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:496 2025-10-06 16:08:12.278 2 DEBUG nova.virt.hardware [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:570 2025-10-06 16:08:12.279 2 DEBUG nova.virt.hardware [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:572 2025-10-06 16:08:12.282 2 DEBUG nova.privsep.utils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63 2025-10-06 16:08:12.282 2 DEBUG oslo_concurrency.processutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:08:12.846 2 DEBUG oslo_concurrency.processutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.564s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:08:12.848 2 DEBUG oslo_concurrency.processutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:08:13.352 2 DEBUG oslo_concurrency.processutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:08:13.354 2 DEBUG nova.virt.libvirt.vif [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-06T16:08:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='test',display_name='test',ec2_ids=EC2Ids,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='standalone.localdomain',hostname='test',id=1,image_ref='ff3d6c10-010e-401c-9389-48fe10b80f3e',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='standalone.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='standalone.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='64e40da08e4a44b0890ef4f0195eaaa4',ramdisk_id='',reservation_id='r-blcu6jl3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin',image_base_image_ref='ff3d6c10-010e-401c-9389-48fe10b80f3e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc-q35-rhel9.0.0',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='admin',owner_user_name='admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T16:08:07Z,user_data=None,user_id='00a0ae66495a4319b62d3402a43653ff',uuid=4fd14755-66e2-4403-a5fb-0f03557d253f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e6ead84d-622e-4356-93d8-2f0ae5455764", "address": "fa:16:3e:ec:db:8e", "network": {"id": "551ecee8-9a26-40b8-b2fb-1be7f7ae2d07", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "64e40da08e4a44b0890ef4f0195eaaa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tape6ead84d-62", "ovs_interfaceid": "e6ead84d-622e-4356-93d8-2f0ae5455764", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:566 2025-10-06 16:08:13.355 2 DEBUG nova.network.os_vif_util [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Converting VIF {"id": "e6ead84d-622e-4356-93d8-2f0ae5455764", "address": "fa:16:3e:ec:db:8e", "network": {"id": "551ecee8-9a26-40b8-b2fb-1be7f7ae2d07", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.69", "type": "fixed", "version":
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "64e40da08e4a44b0890ef4f0195eaaa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tape6ead84d-62", "ovs_interfaceid": "e6ead84d-622e-4356-93d8-2f0ae5455764", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:523 2025-10-06 16:08:13.359 2 DEBUG nova.network.os_vif_util [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:db:8e,bridge_name='br-int',has_traffic_filtering=True,id=e6ead84d-622e-4356-93d8-2f0ae5455764,network=Network(551ecee8-9a26-40b8-b2fb-1be7f7ae2d07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6ead84d-62') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:560 2025-10-06 16:08:13.362 2 DEBUG nova.objects.instance [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lazy-loading 'pci_devices' on Instance uuid 4fd14755-66e2-4403-a5fb-0f03557d253f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1098 2025-10-06 16:08:13.385 2 DEBUG nova.virt.libvirt.driver [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] End _get_guest_xml xml= 4fd14755-66e2-4403-a5fb-0f03557d253f instance-00000001 524288 1 test 2025-10-06 16:08:12 512 1 0 1 1 admin admin Red Hat OpenStack Compute 23.2.3-17.1.20250522071028.2ace99d.el9ost 4fd14755-66e2-4403-a5fb-0f03557d253f 4fd14755-66e2-4403-a5fb-0f03557d253f Virtual Machine hvm /dev/urandom _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7183 2025-10-06 16:08:13.386 2 DEBUG nova.compute.manager [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Preparing to wait for external event network-vif-plugged-e6ead84d-622e-4356-93d8-2f0ae5455764 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:282 2025-10-06 16:08:13.386 2 DEBUG oslo_concurrency.lockutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lock "4fd14755-66e2-4403-a5fb-0f03557d253f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:08:13.387 2 DEBUG oslo_concurrency.lockutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lock "4fd14755-66e2-4403-a5fb-0f03557d253f-events" released by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:08:13.388 2 DEBUG nova.virt.libvirt.vif 
[req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-06T16:08:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='test',display_name='test',ec2_ids=EC2Ids,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='standalone.localdomain',hostname='test',id=1,image_ref='ff3d6c10-010e-401c-9389-48fe10b80f3e',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='standalone.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='standalone.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='64e40da08e4a44b0890ef4f0195eaaa4',ramdisk_id='',reservation_id='r-blcu6jl3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin',image_base_image_ref='ff3d6c10-010e-401c-9389-48fe10b80f3e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc-q35-rhel9.0.0',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='admin',owner_user_name='admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T16:08:07Z,user_data=None,user_id='00a0ae66495a4319b62d3402a43653ff',uuid=4fd14755-66e2-4403-a5fb-0f03557d253f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e6ead84d-622e-4356-93d8-2f0ae5455764", "address": "fa:16:3e:ec:db:8e", "network": {"id": "551ecee8-9a26-40b8-b2fb-1be7f7ae2d07", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "64e40da08e4a44b0890ef4f0195eaaa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tape6ead84d-62", "ovs_interfaceid": "e6ead84d-622e-4356-93d8-2f0ae5455764", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:712 2025-10-06 16:08:13.389 2 DEBUG nova.network.os_vif_util [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Converting VIF {"id": "e6ead84d-622e-4356-93d8-2f0ae5455764", "address": "fa:16:3e:ec:db:8e", "network": {"id": "551ecee8-9a26-40b8-b2fb-1be7f7ae2d07", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway":
{"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "64e40da08e4a44b0890ef4f0195eaaa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tape6ead84d-62", "ovs_interfaceid": "e6ead84d-622e-4356-93d8-2f0ae5455764", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:523 2025-10-06 16:08:13.391 2 DEBUG nova.network.os_vif_util [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:db:8e,bridge_name='br-int',has_traffic_filtering=True,id=e6ead84d-622e-4356-93d8-2f0ae5455764,network=Network(551ecee8-9a26-40b8-b2fb-1be7f7ae2d07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6ead84d-62') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:560 2025-10-06 16:08:13.392 2 DEBUG os_vif [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:db:8e,bridge_name='br-int',has_traffic_filtering=True,id=e6ead84d-622e-4356-93d8-2f0ae5455764,network=Network(551ecee8-9a26-40b8-b2fb-1be7f7ae2d07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6ead84d-62') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76 2025-10-06 16:08:13.562 2 DEBUG ovsdbapp.backend.ovs_idl [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:126 2025-10-06 16:08:13.562 2 DEBUG ovsdbapp.backend.ovs_idl [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:126 2025-10-06 16:08:13.563 2 DEBUG ovsdbapp.backend.ovs_idl [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:126 2025-10-06 16:08:13.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:08:13.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [POLLOUT] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:13.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 
00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:08:13.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:13.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:13.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:13.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:13.582 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89 2025-10-06 16:08:13.582 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129 2025-10-06 16:08:13.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:13.583 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'c6dadd97-f466-5410-b5e1-a5f9bafe3ba0', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89 2025-10-06 16:08:13.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:13.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:13.587 2 INFO oslo.privsep.daemon [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/usr/share/nova/nova-dist.conf', '--config-file', '/etc/nova/nova.conf', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpplch6nx0/privsep.sock'] 2025-10-06 16:08:14.541 2 INFO oslo.privsep.daemon [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Spawned new privsep daemon via rootwrap 2025-10-06 16:08:14.425 957 INFO oslo.privsep.daemon [-] privsep daemon starting 2025-10-06 16:08:14.429 957 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 2025-10-06 16:08:14.431 957 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN/CAP_NET_ADMIN/none 2025-10-06 16:08:14.432 957 INFO oslo.privsep.daemon [-] privsep daemon running as pid 957 2025-10-06 16:08:14.903 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:14.904 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(bridge=br-int, port=tape6ead84d-62, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89 2025-10-06 16:08:14.904 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(table=Port, record=tape6ead84d-62, col_values=(('qos', UUID('58c29672-3e45-47ab-94a0-1692856a0449')),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89 2025-10-06 16:08:14.905 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(table=Interface, record=tape6ead84d-62, col_values=(('external_ids', {'iface-id': 'e6ead84d-622e-4356-93d8-2f0ae5455764', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ec:db:8e', 'vm-uuid': '4fd14755-66e2-4403-a5fb-0f03557d253f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89 2025-10-06 16:08:14.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:14.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:08:14.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:14.949 2 INFO os_vif [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:db:8e,bridge_name='br-int',has_traffic_filtering=True,id=e6ead84d-622e-4356-93d8-2f0ae5455764,network=Network(551ecee8-9a26-40b8-b2fb-1be7f7ae2d07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6ead84d-62') 2025-10-06 16:08:14.987 2 DEBUG nova.virt.libvirt.driver [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11653 2025-10-06 16:08:14.987 2 DEBUG nova.virt.libvirt.driver [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] No BDM found with device name vdb, not building metadata. 
_build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11653 2025-10-06 16:08:14.987 2 DEBUG nova.virt.libvirt.driver [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] No VIF found with MAC fa:16:3e:ec:db:8e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11629 2025-10-06 16:08:15.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:15.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:15.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:15.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:15.241 2 DEBUG nova.compute.manager [req-56882a64-6342-48a2-b4fd-425a24db728a e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Received event network-vif-plugged-e6ead84d-622e-4356-93d8-2f0ae5455764 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10683 2025-10-06 16:08:15.241 2 DEBUG oslo_concurrency.lockutils [req-56882a64-6342-48a2-b4fd-425a24db728a e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] Lock "4fd14755-66e2-4403-a5fb-0f03557d253f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:08:15.242 2 DEBUG oslo_concurrency.lockutils [req-56882a64-6342-48a2-b4fd-425a24db728a e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] Lock "4fd14755-66e2-4403-a5fb-0f03557d253f-events" released by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:08:15.242 2 DEBUG nova.compute.manager [req-56882a64-6342-48a2-b4fd-425a24db728a e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Processing event network-vif-plugged-e6ead84d-622e-4356-93d8-2f0ae5455764 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10449 2025-10-06 16:08:15.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:15.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:15.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:15.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:16.291 2 DEBUG nova.virt.driver [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Emitting event Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1612 2025-10-06 16:08:16.292 2 INFO nova.compute.manager [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] [instance:
4fd14755-66e2-4403-a5fb-0f03557d253f] VM Started (Lifecycle Event) 2025-10-06 16:08:16.304 2 DEBUG nova.virt.libvirt.driver [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4290 2025-10-06 16:08:16.336 2 INFO nova.virt.libvirt.driver [-] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Instance spawned successfully. 2025-10-06 16:08:16.337 2 DEBUG nova.virt.libvirt.driver [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:852 2025-10-06 16:08:16.341 2 DEBUG nova.compute.manager [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1655 2025-10-06 16:08:16.345 2 DEBUG nova.compute.manager [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1289 2025-10-06 16:08:16.351 2 DEBUG nova.block_device [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] block_device_list ['vdb'] volume_in_mapping /usr/lib/python3.9/site-packages/nova/block_device.py:617 2025-10-06 16:08:16.356 2 DEBUG nova.virt.libvirt.driver [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:881 2025-10-06 16:08:16.357 2 DEBUG nova.virt.libvirt.driver [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:881 2025-10-06 16:08:16.357 2 DEBUG nova.virt.libvirt.driver [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:881 2025-10-06 16:08:16.358 2 DEBUG nova.virt.libvirt.driver [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:881 2025-10-06 16:08:16.358 2 DEBUG nova.virt.libvirt.driver 
[req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:881 2025-10-06 16:08:16.359 2 DEBUG nova.virt.libvirt.driver [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:881 2025-10-06 16:08:16.411 2 INFO nova.compute.manager [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] During sync_power_state the instance has a pending task (spawning). Skip. 2025-10-06 16:08:16.412 2 DEBUG nova.virt.driver [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Emitting event Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1612 2025-10-06 16:08:16.412 2 INFO nova.compute.manager [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] VM Paused (Lifecycle Event) 2025-10-06 16:08:16.488 2 DEBUG nova.compute.manager [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1655 2025-10-06 16:08:16.492 2 DEBUG nova.virt.driver [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Emitting event Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1612 2025-10-06 16:08:16.493 2 INFO nova.compute.manager [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] VM Resumed (Lifecycle Event) 2025-10-06 16:08:16.527 2 DEBUG nova.compute.manager [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1655 2025-10-06 16:08:16.532 2 DEBUG nova.compute.manager [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1289 2025-10-06 16:08:16.560 2 INFO nova.compute.manager [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Took 8.72 seconds to spawn the instance on the hypervisor. 2025-10-06 16:08:16.561 2 DEBUG nova.compute.manager [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1655 2025-10-06 16:08:16.575 2 INFO nova.compute.manager [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] During sync_power_state the instance has a pending task (spawning). Skip. 
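The ovsdbapp records earlier in this trace (AddBridgeCommand, AddPortCommand, DbSetCommand, plus the CONNECTING/ACTIVE reconnect states against tcp:127.0.0.1:6640) correspond to os-vif's OVS plugin wiring the tap device into br-int. A minimal sketch of the same sequence through ovsdbapp's public Open_vSwitch API, reusing the names from this log; it illustrates the logged commands, not the exact os-vif code path:

    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    idl = connection.OvsdbIdl.from_server('tcp:127.0.0.1:6640', 'Open_vSwitch')
    api = impl_idl.OvsdbIdl(connection.Connection(idl, timeout=10))

    # external_ids are what the Neutron agent matches on; values taken from
    # the DbSetCommand logged above.
    external_ids = {
        'iface-id': 'e6ead84d-622e-4356-93d8-2f0ae5455764',
        'iface-status': 'active',
        'attached-mac': 'fa:16:3e:ec:db:8e',
        'vm-uuid': '4fd14755-66e2-4403-a5fb-0f03557d253f',
    }

    # One transaction: ensure br-int exists, add the tap port, tag the
    # Interface row. ("Transaction caused no change" above means br-int
    # already existed.)
    with api.transaction(check_error=True) as txn:
        txn.add(api.add_br('br-int', may_exist=True, datapath_type='system'))
        txn.add(api.add_port('br-int', 'tape6ead84d-62', may_exist=True))
        txn.add(api.db_set('Interface', 'tape6ead84d-62',
                           ('external_ids', external_ids)))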
2025-10-06 16:08:16.633 2 INFO nova.compute.manager [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Took 9.59 seconds to build instance. 2025-10-06 16:08:16.648 2 DEBUG oslo_concurrency.lockutils [req-6c6b2ffc-e604-4d6e-b06a-449ec697821f 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lock "4fd14755-66e2-4403-a5fb-0f03557d253f" released by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.679s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:08:17.286 2 DEBUG nova.compute.manager [req-5abffc2d-25db-4449-9299-30ff27beeb26 e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Received event network-vif-plugged-e6ead84d-622e-4356-93d8-2f0ae5455764 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10683 2025-10-06 16:08:17.286 2 DEBUG oslo_concurrency.lockutils [req-5abffc2d-25db-4449-9299-30ff27beeb26 e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] Lock "4fd14755-66e2-4403-a5fb-0f03557d253f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:08:17.287 2 DEBUG oslo_concurrency.lockutils [req-5abffc2d-25db-4449-9299-30ff27beeb26 e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] Lock "4fd14755-66e2-4403-a5fb-0f03557d253f-events" released by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:08:17.287 2 DEBUG nova.compute.manager [req-5abffc2d-25db-4449-9299-30ff27beeb26 e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] No waiting events found dispatching network-vif-plugged-e6ead84d-622e-4356-93d8-2f0ae5455764 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:319 2025-10-06 16:08:17.287 2 WARNING nova.compute.manager [req-5abffc2d-25db-4449-9299-30ff27beeb26 e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Received unexpected event network-vif-plugged-e6ead84d-622e-4356-93d8-2f0ae5455764 for instance with vm_state active and task_state None.
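The WARNING just above is benign in this trace: a waiter for network-vif-plugged exists only while the guest is being spawned, so this second copy of the event, delivered after _locked_do_build_and_run_instance released its lock, finds no waiter and is reported as unexpected. A rough sketch of the registration pattern, assuming nova's ComputeVirtAPI.wait_for_instance_event context manager and the driver's plug_vifs method; the helper name plug_and_wait is hypothetical:

    def plug_and_wait(virtapi, driver, instance, network_info, deadline=300):
        # Register interest in the Neutron events *before* plugging, so the
        # external_instance_event callback seen above has a waiter to wake.
        events = [('network-vif-plugged', vif['id']) for vif in network_info]
        with virtapi.wait_for_instance_event(instance, events,
                                             deadline=deadline):
            driver.plug_vifs(instance, network_info)
            # guest definition/launch happens inside the block; leaving it
            # waits until the events arrive or the deadline expires

Events that arrive outside such a block take the pop_instance_event path shown above and end in the "Received unexpected event" warning.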
2025-10-06 16:08:18.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:20.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:21.857 2 DEBUG nova.compute.manager [req-445b873f-bf22-486e-8537-24ad082eff0c e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Received event network-changed-e6ead84d-622e-4356-93d8-2f0ae5455764 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10683 2025-10-06 16:08:21.858 2 DEBUG nova.compute.manager [req-445b873f-bf22-486e-8537-24ad082eff0c e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Refreshing instance network info cache due to event network-changed-e6ead84d-622e-4356-93d8-2f0ae5455764. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10688 2025-10-06 16:08:21.858 2 DEBUG oslo_concurrency.lockutils [req-445b873f-bf22-486e-8537-24ad082eff0c e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] Acquired lock "refresh_cache-4fd14755-66e2-4403-a5fb-0f03557d253f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:266 2025-10-06 16:08:21.858 2 DEBUG nova.network.neutron [req-445b873f-bf22-486e-8537-24ad082eff0c e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Refreshing network info cache for port e6ead84d-622e-4356-93d8-2f0ae5455764 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:1869 2025-10-06 16:08:22.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:22.733 2 DEBUG nova.network.neutron [req-445b873f-bf22-486e-8537-24ad082eff0c e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Updated VIF entry in instance network info cache for port e6ead84d-622e-4356-93d8-2f0ae5455764. 
_build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3183 2025-10-06 16:08:22.734 2 DEBUG nova.network.neutron [req-445b873f-bf22-486e-8537-24ad082eff0c e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Updating instance_info_cache with network_info: [{"id": "e6ead84d-622e-4356-93d8-2f0ae5455764", "address": "fa:16:3e:ec:db:8e", "network": {"id": "551ecee8-9a26-40b8-b2fb-1be7f7ae2d07", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "64e40da08e4a44b0890ef4f0195eaaa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tape6ead84d-62", "ovs_interfaceid": "e6ead84d-622e-4356-93d8-2f0ae5455764", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116 2025-10-06 16:08:22.751 2 DEBUG oslo_concurrency.lockutils [req-445b873f-bf22-486e-8537-24ad082eff0c e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] Releasing lock "refresh_cache-4fd14755-66e2-4403-a5fb-0f03557d253f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:282 2025-10-06 16:08:23.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:25.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:28.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:30.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:33.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:35.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:38.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:40.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:43.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:45.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:48.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:49.400 2 DEBUG oslo_service.periodic_task 
[req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:08:50.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:50.321 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:08:53.398 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:08:53.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:08:53.421 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:08:53.422 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:08:53.422 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858 2025-10-06 16:08:53.423 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:08:53.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:53.903 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:08:53.986 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-06 16:08:53.986 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-06 16:08:54.214 2 WARNING nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. 2025-10-06 16:08:54.216 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=4105MB free_disk=6.950572967529297GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033 2025-10-06 16:08:54.217 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:08:54.295 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Instance 4fd14755-66e2-4403-a5fb-0f03557d253f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1548 2025-10-06 16:08:54.295 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056 2025-10-06 16:08:54.296 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=6GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065 2025-10-06 16:08:54.317 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:08:54.923 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.606s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:08:54.934 2 DEBUG nova.compute.provider_tree [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Inventory has not changed in ProviderTree for provider: 746a1a56-6a22-48dd-85a9-45922719c8f6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179 2025-10-06 16:08:54.948 2 DEBUG nova.scheduler.client.report [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Inventory has not changed for provider 746a1a56-6a22-48dd-85a9-45922719c8f6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 2025-10-06 16:08:54.949 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994 2025-10-06 16:08:54.949 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.732s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:08:55.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:55.949 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:08:55.951 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:08:55.961 2 DEBUG 
oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:08:57.413 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:08:57.414 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-06 16:08:57.414 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518 2025-10-06 16:08:57.524 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Acquired lock "refresh_cache-4fd14755-66e2-4403-a5fb-0f03557d253f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:266 2025-10-06 16:08:57.524 2 DEBUG nova.network.neutron [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:1866 2025-10-06 16:08:57.525 2 DEBUG nova.objects.instance [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lazy-loading 'info_cache' on Instance uuid 4fd14755-66e2-4403-a5fb-0f03557d253f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1098 2025-10-06 16:08:58.076 2 DEBUG nova.network.neutron [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Updating instance_info_cache with network_info: [{"id": "e6ead84d-622e-4356-93d8-2f0ae5455764", "address": "fa:16:3e:ec:db:8e", "network": {"id": "551ecee8-9a26-40b8-b2fb-1be7f7ae2d07", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "64e40da08e4a44b0890ef4f0195eaaa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tape6ead84d-62", "ovs_interfaceid": "e6ead84d-622e-4356-93d8-2f0ae5455764", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116 2025-10-06 16:08:58.087 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Releasing lock "refresh_cache-4fd14755-66e2-4403-a5fb-0f03557d253f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:282 2025-10-06 16:08:58.088 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Updated the network info_cache for instance 
_heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9585 2025-10-06 16:08:58.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:08:59.398 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:08:59.399 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-06 16:09:00.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:03.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:05.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:06.428 2 DEBUG oslo_concurrency.lockutils [req-607ed657-54c2-4f25-8f62-c4979ae96727 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lock "4fd14755-66e2-4403-a5fb-0f03557d253f" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:09:06.441 2 DEBUG nova.block_device [req-607ed657-54c2-4f25-8f62-c4979ae96727 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] block_device_list ['vdb'] volume_in_mapping /usr/lib/python3.9/site-packages/nova/block_device.py:617 2025-10-06 16:09:06.441 2 DEBUG nova.objects.instance [req-607ed657-54c2-4f25-8f62-c4979ae96727 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lazy-loading 'flavor' on Instance uuid 4fd14755-66e2-4403-a5fb-0f03557d253f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1098 2025-10-06 16:09:06.482 2 DEBUG oslo_concurrency.lockutils [req-607ed657-54c2-4f25-8f62-c4979ae96727 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lock "4fd14755-66e2-4403-a5fb-0f03557d253f" released by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.054s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:09:06.613 2 DEBUG oslo_concurrency.lockutils [req-607ed657-54c2-4f25-8f62-c4979ae96727 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lock "4fd14755-66e2-4403-a5fb-0f03557d253f" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:09:06.614 2 INFO nova.compute.manager [req-607ed657-54c2-4f25-8f62-c4979ae96727 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Attaching volume 7de8e18d-73e1-41c5-9d9b-8106ba33ae87 to /dev/vdc 2025-10-06 16:09:06.707 2 DEBUG os_brick.utils [req-607ed657-54c2-4f25-8f62-c4979ae96727 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] ==> get_connector_properties: call "{'root_helper': 'sudo
nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '172.17.0.100', 'multipath': False, 'enforce_multipath': True, 'host': 'standalone.localdomain', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:163 2025-10-06 16:09:06.708 2 INFO oslo.privsep.daemon [req-607ed657-54c2-4f25-8f62-c4979ae96727 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/usr/share/nova/nova-dist.conf', '--config-file', '/etc/nova/nova.conf', '--privsep_context', 'os_brick.privileged.default', '--privsep_sock_path', '/tmp/tmpcrbz85gq/privsep.sock'] 2025-10-06 16:09:07.476 2 INFO oslo.privsep.daemon [req-607ed657-54c2-4f25-8f62-c4979ae96727 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Spawned new privsep daemon via rootwrap 2025-10-06 16:09:07.365 1026 INFO oslo.privsep.daemon [-] privsep daemon starting 2025-10-06 16:09:07.370 1026 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 2025-10-06 16:09:07.374 1026 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none 2025-10-06 16:09:07.374 1026 INFO oslo.privsep.daemon [-] privsep daemon running as pid 1026 2025-10-06 16:09:07.478 1026 DEBUG oslo.privsep.daemon [-] privsep: reply[140270817684992]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:512 2025-10-06 16:09:07.607 1026 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:09:07.615 1026 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:09:07.615 1026 DEBUG oslo.privsep.daemon [-] privsep: reply[140270817684992]: (4, ('InitiatorName=iqn.1994-05.com.redhat:c679f7e627d\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:512 2025-10-06 16:09:07.616 2 WARNING os_brick.initiator.connectors.nvmeof [req-607ed657-54c2-4f25-8f62-c4979ae96727 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Could not find nvme_core/parameters/multipath: FileNotFoundError: [Errno 2] No such file or directory: '/sys/module/nvme_core/parameters/multipath' 2025-10-06 16:09:07.617 1026 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:09:07.629 1026 DEBUG oslo_concurrency.processutils [-] CMD "findmnt / -n -o SOURCE" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:09:07.629 1026 DEBUG oslo.privsep.daemon [-] privsep: reply[140270817684992]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:512 2025-10-06 16:09:07.631 1026 DEBUG oslo.privsep.daemon [-] privsep: reply[140270817684992]: (4, '596793c4-40a4-4b01-8c11-5928f5df12e4') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:512 2025-10-06 16:09:07.631 2 DEBUG oslo_concurrency.processutils [req-607ed657-54c2-4f25-8f62-c4979ae96727 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Running cmd (subprocess): nvme version execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:09:07.655 2 DEBUG oslo_concurrency.processutils [req-607ed657-54c2-4f25-8f62-c4979ae96727 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] CMD "nvme version" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:09:07.657 2 DEBUG os_brick.utils [req-607ed657-54c2-4f25-8f62-c4979ae96727 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] <== get_connector_properties: return (949ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '172.17.0.100', 'host': 'standalone.localdomain', 'multipath': False, 'initiator': 'iqn.1994-05.com.redhat:c679f7e627d', 'do_local_attach': False, 'system uuid': '596793c4-40a4-4b01-8c11-5928f5df12e4', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:33069693-bb8b-4414-95bc-37b06483388d', 'nvme_native_multipath': False} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:190 2025-10-06 16:09:07.657 2 DEBUG nova.virt.block_device [req-607ed657-54c2-4f25-8f62-c4979ae96727 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Updating existing volume attachment record: 4cdc3765-1b7c-4fdf-b48b-03bd91b3353c _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:569 2025-10-06 16:09:08.435 2 DEBUG oslo_concurrency.lockutils [req-607ed657-54c2-4f25-8f62-c4979ae96727 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lock "cache_volume_driver" acquired by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:09:08.439 2 DEBUG oslo_concurrency.lockutils [req-607ed657-54c2-4f25-8f62-c4979ae96727 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lock "cache_volume_driver" released by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:09:08.445 2 DEBUG nova.objects.instance [req-607ed657-54c2-4f25-8f62-c4979ae96727 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lazy-loading 'flavor' on Instance uuid 4fd14755-66e2-4403-a5fb-0f03557d253f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1098 2025-10-06 16:09:08.489 2 DEBUG nova.virt.libvirt.driver [req-607ed657-54c2-4f25-8f62-c4979ae96727 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Attempting to attach volume 7de8e18d-73e1-41c5-9d9b-8106ba33ae87 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device.
_check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2108 2025-10-06 16:09:08.493 2 DEBUG nova.virt.libvirt.guest [req-607ed657-54c2-4f25-8f62-c4979ae96727 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] attach device xml: 7de8e18d-73e1-41c5-9d9b-8106ba33ae87 attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:320 2025-10-06 16:09:08.601 2 DEBUG nova.virt.libvirt.driver [req-607ed657-54c2-4f25-8f62-c4979ae96727 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11653 2025-10-06 16:09:08.602 2 DEBUG nova.virt.libvirt.driver [req-607ed657-54c2-4f25-8f62-c4979ae96727 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11653 2025-10-06 16:09:08.602 2 DEBUG nova.virt.libvirt.driver [req-607ed657-54c2-4f25-8f62-c4979ae96727 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] No BDM found with device name vdc, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11653 2025-10-06 16:09:08.603 2 DEBUG nova.virt.libvirt.driver [req-607ed657-54c2-4f25-8f62-c4979ae96727 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] No VIF found with MAC fa:16:3e:ec:db:8e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11629 2025-10-06 16:09:08.797 2 DEBUG oslo_concurrency.lockutils [req-607ed657-54c2-4f25-8f62-c4979ae96727 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lock "4fd14755-66e2-4403-a5fb-0f03557d253f" released by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 2.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:09:08.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:10.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:13.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:15.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:18.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:20.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:21.418 2 DEBUG oslo_concurrency.lockutils [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lock "773e8581-e83f-432d-8be8-bb426e08df6c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:09:21.434 2 DEBUG nova.compute.manager
[req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2306 2025-10-06 16:09:21.522 2 DEBUG oslo_concurrency.lockutils [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:09:21.528 2 DEBUG nova.virt.hardware [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2200 2025-10-06 16:09:21.528 2 INFO nova.compute.claims [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Claim successful on node standalone.localdomain 2025-10-06 16:09:21.665 2 DEBUG oslo_concurrency.processutils [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:09:22.121 2 DEBUG oslo_concurrency.processutils [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:09:22.128 2 DEBUG nova.compute.provider_tree [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Inventory has not changed in ProviderTree for provider: 746a1a56-6a22-48dd-85a9-45922719c8f6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179 2025-10-06 16:09:22.141 2 DEBUG nova.scheduler.client.report [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Inventory has not changed for provider 746a1a56-6a22-48dd-85a9-45922719c8f6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 2025-10-06 16:09:22.142 2 DEBUG oslo_concurrency.lockutils [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.620s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:09:22.143 2 DEBUG nova.compute.manager [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 
00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2679 2025-10-06 16:09:22.213 2 DEBUG nova.compute.manager [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1845 2025-10-06 16:09:22.214 2 DEBUG nova.network.neutron [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1020 2025-10-06 16:09:22.224 2 INFO nova.virt.libvirt.driver [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names 2025-10-06 16:09:22.258 2 DEBUG nova.compute.manager [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2714 2025-10-06 16:09:22.336 2 INFO nova.virt.block_device [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Booting with volume 023acf71-fdf4-4df1-b7a3-ee19894437ae at /dev/vda 2025-10-06 16:09:22.435 2 DEBUG os_brick.utils [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '172.17.0.100', 'multipath': False, 'enforce_multipath': True, 'host': 'standalone.localdomain', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:163 2025-10-06 16:09:22.437 1026 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:09:22.445 1026 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:09:22.445 1026 DEBUG oslo.privsep.daemon [-] privsep: reply[140270815767008]: (4, ('InitiatorName=iqn.1994-05.com.redhat:c679f7e627d\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:512 2025-10-06 16:09:22.447 1026 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:09:22.455 1026 DEBUG oslo_concurrency.processutils [-] CMD "findmnt / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:09:22.455 1026 DEBUG oslo.privsep.daemon [-] privsep: reply[140270815767008]: (4, 
('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:512 2025-10-06 16:09:22.457 1026 DEBUG oslo.privsep.daemon [-] privsep: reply[140270815767008]: (4, '596793c4-40a4-4b01-8c11-5928f5df12e4') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:512 2025-10-06 16:09:22.457 2 DEBUG oslo_concurrency.processutils [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:09:22.470 2 DEBUG oslo_concurrency.processutils [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] CMD "nvme version" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:09:22.472 2 DEBUG os_brick.utils [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] <== get_connector_properties: return (35ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '172.17.0.100', 'host': 'standalone.localdomain', 'multipath': False, 'initiator': 'iqn.1994-05.com.redhat:c679f7e627d', 'do_local_attach': False, 'system uuid': '596793c4-40a4-4b01-8c11-5928f5df12e4', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:33069693-bb8b-4414-95bc-37b06483388d', 'nvme_native_multipath': False} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:190 2025-10-06 16:09:22.472 2 DEBUG nova.virt.block_device [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Updating existing volume attachment record: e98a1f3c-1ab2-44f8-9a8b-a2e0ab7b3147 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:569 2025-10-06 16:09:22.963 2 DEBUG nova.network.neutron [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Successfully created port: 1511a682-810a-45db-8cd9-bcc2c61b241b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:549 2025-10-06 16:09:23.414 2 DEBUG nova.compute.manager [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Start spawning the instance on the hypervisor. 
_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2510 2025-10-06 16:09:23.415 2 DEBUG nova.virt.libvirt.driver [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4592 2025-10-06 16:09:23.415 2 INFO nova.virt.libvirt.driver [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Creating image 2025-10-06 16:09:23.448 2 DEBUG nova.storage.rbd_utils [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] rbd image 773e8581-e83f-432d-8be8-bb426e08df6c_disk.eph0 does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:79 2025-10-06 16:09:23.472 2 DEBUG nova.storage.rbd_utils [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] rbd image 773e8581-e83f-432d-8be8-bb426e08df6c_disk.eph0 does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:79 2025-10-06 16:09:23.514 2 DEBUG nova.storage.rbd_utils [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] rbd image 773e8581-e83f-432d-8be8-bb426e08df6c_disk.eph0 does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:79 2025-10-06 16:09:23.519 2 DEBUG oslo_concurrency.processutils [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:09:23.596 2 DEBUG oslo_concurrency.processutils [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:09:23.598 2 DEBUG oslo_concurrency.lockutils [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lock "ephemeral_1_0706d66" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:09:23.599 2 DEBUG oslo_concurrency.lockutils [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lock "ephemeral_1_0706d66" released by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:09:23.632 2 DEBUG nova.storage.rbd_utils [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] rbd image
773e8581-e83f-432d-8be8-bb426e08df6c_disk.eph0 does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:79 2025-10-06 16:09:23.643 2 DEBUG oslo_concurrency.processutils [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/ephemeral_1_0706d66 773e8581-e83f-432d-8be8-bb426e08df6c_disk.eph0 --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:09:23.829 2 DEBUG nova.network.neutron [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Successfully updated port: 1511a682-810a-45db-8cd9-bcc2c61b241b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:587 2025-10-06 16:09:23.840 2 DEBUG oslo_concurrency.lockutils [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Acquired lock "refresh_cache-773e8581-e83f-432d-8be8-bb426e08df6c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:266 2025-10-06 16:09:23.840 2 DEBUG nova.network.neutron [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:1872 2025-10-06 16:09:23.871 2 DEBUG nova.compute.manager [req-1f2d3076-a715-4447-8e56-cdd9623ea025 e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Received event network-changed-1511a682-810a-45db-8cd9-bcc2c61b241b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10683 2025-10-06 16:09:23.871 2 DEBUG nova.compute.manager [req-1f2d3076-a715-4447-8e56-cdd9623ea025 e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Refreshing instance network info cache due to event network-changed-1511a682-810a-45db-8cd9-bcc2c61b241b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10688 2025-10-06 16:09:23.919 2 DEBUG nova.network.neutron [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Instance cache missing network info. 
_get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3049 2025-10-06 16:09:24.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:24.239 2 DEBUG oslo_concurrency.processutils [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/ephemeral_1_0706d66 773e8581-e83f-432d-8be8-bb426e08df6c_disk.eph0 --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.596s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:09:24.290 2 DEBUG nova.network.neutron [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Updating instance_info_cache with network_info: [{"id": "1511a682-810a-45db-8cd9-bcc2c61b241b", "address": "fa:16:3e:e7:4b:57", "network": {"id": "551ecee8-9a26-40b8-b2fb-1be7f7ae2d07", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "64e40da08e4a44b0890ef4f0195eaaa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap1511a682-81", "ovs_interfaceid": "1511a682-810a-45db-8cd9-bcc2c61b241b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116 2025-10-06 16:09:24.333 2 DEBUG oslo_concurrency.lockutils [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Releasing lock "refresh_cache-773e8581-e83f-432d-8be8-bb426e08df6c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:282 2025-10-06 16:09:24.334 2 DEBUG nova.compute.manager [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Instance network_info: |[{"id": "1511a682-810a-45db-8cd9-bcc2c61b241b", "address": "fa:16:3e:e7:4b:57", "network": {"id": "551ecee8-9a26-40b8-b2fb-1be7f7ae2d07", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "64e40da08e4a44b0890ef4f0195eaaa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap1511a682-81", "ovs_interfaceid": "1511a682-810a-45db-8cd9-bcc2c61b241b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1859 2025-10-06 16:09:24.334 2 DEBUG oslo_concurrency.lockutils [req-1f2d3076-a715-4447-8e56-cdd9623ea025 e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] Acquired lock "refresh_cache-773e8581-e83f-432d-8be8-bb426e08df6c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:266 2025-10-06 16:09:24.334 2 DEBUG nova.network.neutron [req-1f2d3076-a715-4447-8e56-cdd9623ea025 e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Refreshing network info cache for port 1511a682-810a-45db-8cd9-bcc2c61b241b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:1869 2025-10-06 16:09:24.345 2 DEBUG nova.virt.libvirt.driver [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719 2025-10-06 16:09:24.347 2 DEBUG nova.virt.libvirt.driver [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Ensure instance console log exists: /var/lib/nova/instances/773e8581-e83f-432d-8be8-bb426e08df6c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4482 2025-10-06 16:09:24.347 2 DEBUG oslo_concurrency.lockutils [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:09:24.348 2 DEBUG oslo_concurrency.lockutils [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lock "vgpu_resources" released by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:09:24.351 2 DEBUG nova.virt.libvirt.driver [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Start _get_guest_xml network_info=[{"id": "1511a682-810a-45db-8cd9-bcc2c61b241b", "address": "fa:16:3e:e7:4b:57", "network": {"id": "551ecee8-9a26-40b8-b2fb-1be7f7ae2d07", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "64e40da08e4a44b0890ef4f0195eaaa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap1511a682-81", "ovs_interfaceid": "1511a682-810a-45db-8cd9-bcc2c61b241b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.eph0': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum=,container_format=,created_at=,direct_url=,disk_format=,id=,min_disk=0,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'ephemerals': [{'device_name': '/dev/vdb', 'guest_format': None, 'disk_bus': 'virtio', 'size': 1, 'device_type': 'disk'}], 'block_device_mapping': [{'delete_on_termination': False, 'mount_device': '/dev/vda', 'guest_format': None, 'disk_bus': 'virtio', 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-023acf71-fdf4-4df1-b7a3-ee19894437ae', 'hosts': ['172.18.0.100'], 'ports': ['6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '023acf71-fdf4-4df1-b7a3-ee19894437ae', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '773e8581-e83f-432d-8be8-bb426e08df6c', 'attached_at': '', 'detached_at': '', 'volume_id': '023acf71-fdf4-4df1-b7a3-ee19894437ae', 'serial': '023acf71-fdf4-4df1-b7a3-ee19894437ae'}, 'boot_index': 0, 'attachment_id': 'e98a1f3c-1ab2-44f8-9a8b-a2e0ab7b3147', 'device_type': 'disk', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7177 2025-10-06 16:09:24.359 2 WARNING nova.virt.libvirt.driver [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 2025-10-06 16:09:24.361 2 DEBUG nova.virt.libvirt.host [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Searching host: 'standalone.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1463 2025-10-06 16:09:24.361 2 DEBUG nova.virt.libvirt.host [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1473 2025-10-06 16:09:24.363 2 DEBUG nova.virt.libvirt.host [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Searching host: 'standalone.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1482 2025-10-06 16:09:24.363 2 DEBUG nova.virt.libvirt.host [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] CPU controller found on host. 
_has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1489 2025-10-06 16:09:24.364 2 DEBUG nova.virt.libvirt.driver [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5177 2025-10-06 16:09:24.364 2 DEBUG nova.virt.hardware [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-06T16:07:13Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=1,extra_specs={},flavorid='db493b7d-b93a-4c5e-8afe-780886438f8d',id=3,is_public=True,memory_mb=512,name='m1.small',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format=,created_at=,direct_url=,disk_format=,id=,min_disk=0,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=,virtual_size=,visibility=), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:558 2025-10-06 16:09:24.364 2 DEBUG nova.virt.hardware [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:343 2025-10-06 16:09:24.364 2 DEBUG nova.virt.hardware [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:347 2025-10-06 16:09:24.365 2 DEBUG nova.virt.hardware [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:383 2025-10-06 16:09:24.365 2 DEBUG nova.virt.hardware [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:387 2025-10-06 16:09:24.365 2 DEBUG nova.virt.hardware [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:425 2025-10-06 16:09:24.365 2 DEBUG nova.virt.hardware [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:564 2025-10-06 16:09:24.365 2 DEBUG nova.virt.hardware [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:466 2025-10-06 16:09:24.365 2 DEBUG 
nova.virt.hardware [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:496 2025-10-06 16:09:24.365 2 DEBUG nova.virt.hardware [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:570 2025-10-06 16:09:24.366 2 DEBUG nova.virt.hardware [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:572 2025-10-06 16:09:24.366 2 DEBUG oslo_concurrency.processutils [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:09:24.785 2 DEBUG nova.network.neutron [req-1f2d3076-a715-4447-8e56-cdd9623ea025 e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Updated VIF entry in instance network info cache for port 1511a682-810a-45db-8cd9-bcc2c61b241b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3183 2025-10-06 16:09:24.787 2 DEBUG nova.network.neutron [req-1f2d3076-a715-4447-8e56-cdd9623ea025 e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Updating instance_info_cache with network_info: [{"id": "1511a682-810a-45db-8cd9-bcc2c61b241b", "address": "fa:16:3e:e7:4b:57", "network": {"id": "551ecee8-9a26-40b8-b2fb-1be7f7ae2d07", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "64e40da08e4a44b0890ef4f0195eaaa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap1511a682-81", "ovs_interfaceid": "1511a682-810a-45db-8cd9-bcc2c61b241b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116 2025-10-06 16:09:24.810 2 DEBUG oslo_concurrency.lockutils [req-1f2d3076-a715-4447-8e56-cdd9623ea025 e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] Releasing lock "refresh_cache-773e8581-e83f-432d-8be8-bb426e08df6c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:282 2025-10-06 16:09:24.827 2 DEBUG oslo_concurrency.processutils [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - 
default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:09:24.854 2 DEBUG nova.virt.libvirt.vif [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-06T16:09:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='bfv-server',display_name='bfv-server',ec2_ids=EC2Ids,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='standalone.localdomain',hostname='bfv-server',id=2,image_ref='',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='standalone.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=None,node='standalone.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='64e40da08e4a44b0890ef4f0195eaaa4',ramdisk_id='',reservation_id='r-wecpmqw0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin',image_base_image_ref='',image_hw_machine_type='pc-q35-rhel9.0.0',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',image_signature_verified='False',network_allocated='True',owner_project_name='admin',owner_user_name='admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T16:09:22Z,user_data=None,user_id='00a0ae66495a4319b62d3402a43653ff',uuid=773e8581-e83f-432d-8be8-bb426e08df6c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1511a682-810a-45db-8cd9-bcc2c61b241b", "address": "fa:16:3e:e7:4b:57", "network": {"id": "551ecee8-9a26-40b8-b2fb-1be7f7ae2d07", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "64e40da08e4a44b0890ef4f0195eaaa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap1511a682-81", "ovs_interfaceid": "1511a682-810a-45db-8cd9-bcc2c61b241b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:566 2025-10-06 16:09:24.855 2 DEBUG nova.network.os_vif_util [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Converting VIF {"id": "1511a682-810a-45db-8cd9-bcc2c61b241b", "address":
"fa:16:3e:e7:4b:57", "network": {"id": "551ecee8-9a26-40b8-b2fb-1be7f7ae2d07", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "64e40da08e4a44b0890ef4f0195eaaa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap1511a682-81", "ovs_interfaceid": "1511a682-810a-45db-8cd9-bcc2c61b241b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:523 2025-10-06 16:09:24.857 2 DEBUG nova.network.os_vif_util [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:4b:57,bridge_name='br-int',has_traffic_filtering=True,id=1511a682-810a-45db-8cd9-bcc2c61b241b,network=Network(551ecee8-9a26-40b8-b2fb-1be7f7ae2d07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1511a682-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:560 2025-10-06 16:09:24.858 2 DEBUG nova.objects.instance [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lazy-loading 'pci_devices' on Instance uuid 773e8581-e83f-432d-8be8-bb426e08df6c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1098 2025-10-06 16:09:24.872 2 DEBUG nova.virt.libvirt.driver [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] End _get_guest_xml xml=[guest domain XML not preserved: markup was stripped in this capture; surviving text nodes: 773e8581-e83f-432d-8be8-bb426e08df6c instance-00000002 524288 1 bfv-server 2025-10-06 16:09:24 512 1 0 1 1 admin admin Red Hat OpenStack Compute 23.2.3-17.1.20250522071028.2ace99d.el9ost 773e8581-e83f-432d-8be8-bb426e08df6c 773e8581-e83f-432d-8be8-bb426e08df6c Virtual Machine hvm 023acf71-fdf4-4df1-b7a3-ee19894437ae /dev/urandom] _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7183 2025-10-06 16:09:24.873 2 DEBUG nova.compute.manager [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Preparing to wait for external event network-vif-plugged-1511a682-810a-45db-8cd9-bcc2c61b241b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:282 2025-10-06 16:09:24.874 2 DEBUG oslo_concurrency.lockutils [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lock "773e8581-e83f-432d-8be8-bb426e08df6c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:09:24.874 2 DEBUG oslo_concurrency.lockutils [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff
64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lock "773e8581-e83f-432d-8be8-bb426e08df6c-events" released by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:09:24.875 2 DEBUG nova.virt.libvirt.vif [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-06T16:09:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='bfv-server',display_name='bfv-server',ec2_ids=EC2Ids,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='standalone.localdomain',hostname='bfv-server',id=2,image_ref='',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='standalone.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=None,node='standalone.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='64e40da08e4a44b0890ef4f0195eaaa4',ramdisk_id='',reservation_id='r-wecpmqw0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin',image_base_image_ref='',image_hw_machine_type='pc-q35-rhel9.0.0',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',image_signature_verified='False',network_allocated='True',owner_project_name='admin',owner_user_name='admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-06T16:09:22Z,user_data=None,user_id='00a0ae66495a4319b62d3402a43653ff',uuid=773e8581-e83f-432d-8be8-bb426e08df6c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1511a682-810a-45db-8cd9-bcc2c61b241b", "address": "fa:16:3e:e7:4b:57", "network": {"id": "551ecee8-9a26-40b8-b2fb-1be7f7ae2d07", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "64e40da08e4a44b0890ef4f0195eaaa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap1511a682-81", "ovs_interfaceid": "1511a682-810a-45db-8cd9-bcc2c61b241b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:712 2025-10-06 16:09:24.875 2 DEBUG nova.network.os_vif_util [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default
default] Converting VIF {"id": "1511a682-810a-45db-8cd9-bcc2c61b241b", "address": "fa:16:3e:e7:4b:57", "network": {"id": "551ecee8-9a26-40b8-b2fb-1be7f7ae2d07", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "64e40da08e4a44b0890ef4f0195eaaa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap1511a682-81", "ovs_interfaceid": "1511a682-810a-45db-8cd9-bcc2c61b241b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:523 2025-10-06 16:09:24.876 2 DEBUG nova.network.os_vif_util [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:4b:57,bridge_name='br-int',has_traffic_filtering=True,id=1511a682-810a-45db-8cd9-bcc2c61b241b,network=Network(551ecee8-9a26-40b8-b2fb-1be7f7ae2d07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1511a682-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:560 2025-10-06 16:09:24.877 2 DEBUG os_vif [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:4b:57,bridge_name='br-int',has_traffic_filtering=True,id=1511a682-810a-45db-8cd9-bcc2c61b241b,network=Network(551ecee8-9a26-40b8-b2fb-1be7f7ae2d07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1511a682-81') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76 2025-10-06 16:09:24.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:24.878 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89 2025-10-06 16:09:24.879 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129 2025-10-06 16:09:24.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:24.881 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '10cde559-a29c-5b7e-a28c-f4565d329aa4', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89 2025-10-06 16:09:24.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:24.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:24.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:24.888 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(bridge=br-int, port=tap1511a682-81, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89 2025-10-06 16:09:24.889 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(table=Port, record=tap1511a682-81, col_values=(('qos', UUID('9be76999-7b87-478b-ab06-071bc9c0cd72')),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89 2025-10-06 16:09:24.889 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(table=Interface, record=tap1511a682-81, col_values=(('external_ids', {'iface-id': '1511a682-810a-45db-8cd9-bcc2c61b241b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e7:4b:57', 'vm-uuid': '773e8581-e83f-432d-8be8-bb426e08df6c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89 2025-10-06 16:09:24.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:09:24.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:24.900 2 INFO os_vif [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:4b:57,bridge_name='br-int',has_traffic_filtering=True,id=1511a682-810a-45db-8cd9-bcc2c61b241b,network=Network(551ecee8-9a26-40b8-b2fb-1be7f7ae2d07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1511a682-81') 2025-10-06 16:09:24.949 2 DEBUG nova.virt.libvirt.driver [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11653 2025-10-06 16:09:24.949 2 DEBUG nova.virt.libvirt.driver [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] No BDM found with device name vdb, not building metadata. 
_build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11653 2025-10-06 16:09:24.949 2 DEBUG nova.virt.libvirt.driver [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] No VIF found with MAC fa:16:3e:e7:4b:57, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11629 2025-10-06 16:09:24.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:25.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:25.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:25.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:25.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:25.920 2 DEBUG nova.compute.manager [req-f9d927ac-8a4d-4890-9e5b-299b342422f4 e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Received event network-vif-plugged-1511a682-810a-45db-8cd9-bcc2c61b241b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10683 2025-10-06 16:09:25.921 2 DEBUG oslo_concurrency.lockutils [req-f9d927ac-8a4d-4890-9e5b-299b342422f4 e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] Lock "773e8581-e83f-432d-8be8-bb426e08df6c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:09:25.921 2 DEBUG oslo_concurrency.lockutils [req-f9d927ac-8a4d-4890-9e5b-299b342422f4 e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] Lock "773e8581-e83f-432d-8be8-bb426e08df6c-events" released by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:09:25.922 2 DEBUG nova.compute.manager [req-f9d927ac-8a4d-4890-9e5b-299b342422f4 e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Processing event network-vif-plugged-1511a682-810a-45db-8cd9-bcc2c61b241b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10449 2025-10-06 16:09:25.922 2 DEBUG nova.compute.manager [req-f9d927ac-8a4d-4890-9e5b-299b342422f4 e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Received event network-vif-plugged-1511a682-810a-45db-8cd9-bcc2c61b241b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10683 2025-10-06 16:09:25.922 2 DEBUG oslo_concurrency.lockutils [req-f9d927ac-8a4d-4890-9e5b-299b342422f4 e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] Lock "773e8581-e83f-432d-8be8-bb426e08df6c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:09:25.922 2 DEBUG oslo_concurrency.lockutils [req-f9d927ac-8a4d-4890-9e5b-299b342422f4 e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] Lock "773e8581-e83f-432d-8be8-bb426e08df6c-events" released by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:09:25.922 2 DEBUG nova.compute.manager [req-f9d927ac-8a4d-4890-9e5b-299b342422f4 e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] No waiting events found dispatching network-vif-plugged-1511a682-810a-45db-8cd9-bcc2c61b241b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:319 2025-10-06 16:09:25.922 2 WARNING nova.compute.manager [req-f9d927ac-8a4d-4890-9e5b-299b342422f4 e65f3013aa8145999e5dbadc9d0c678d 5ec2f863d8f54b06a96618816ae879f8 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Received unexpected event network-vif-plugged-1511a682-810a-45db-8cd9-bcc2c61b241b for instance with vm_state building and task_state spawning. 2025-10-06 16:09:25.944 2 DEBUG nova.virt.driver [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Emitting event <LifecycleEvent: …, 773e8581-e83f-432d-8be8-bb426e08df6c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1612 2025-10-06 16:09:25.944 2 INFO nova.compute.manager [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] VM Started (Lifecycle Event) 2025-10-06 16:09:25.956 2 DEBUG nova.virt.libvirt.driver [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4290 2025-10-06 16:09:25.963 2 INFO nova.virt.libvirt.driver [-] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Instance spawned successfully.
2025-10-06 16:09:25.963 2 DEBUG nova.virt.libvirt.driver [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:852 2025-10-06 16:09:25.974 2 DEBUG nova.compute.manager [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1655 2025-10-06 16:09:25.978 2 DEBUG nova.compute.manager [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1289 2025-10-06 16:09:25.985 2 DEBUG nova.virt.libvirt.driver [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:881 2025-10-06 16:09:25.985 2 DEBUG nova.virt.libvirt.driver [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:881 2025-10-06 16:09:25.986 2 DEBUG nova.virt.libvirt.driver [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:881 2025-10-06 16:09:25.986 2 DEBUG nova.virt.libvirt.driver [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:881 2025-10-06 16:09:25.987 2 DEBUG nova.virt.libvirt.driver [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:881 2025-10-06 16:09:25.987 2 DEBUG nova.virt.libvirt.driver [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:881 2025-10-06 16:09:26.016 2 INFO nova.compute.manager [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] [instance: 
773e8581-e83f-432d-8be8-bb426e08df6c] During sync_power_state the instance has a pending task (spawning). Skip. 2025-10-06 16:09:26.017 2 DEBUG nova.virt.driver [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Emitting event <LifecycleEvent: …, 773e8581-e83f-432d-8be8-bb426e08df6c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1612 2025-10-06 16:09:26.017 2 INFO nova.compute.manager [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] VM Paused (Lifecycle Event) 2025-10-06 16:09:26.066 2 DEBUG nova.compute.manager [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1655 2025-10-06 16:09:26.075 2 INFO nova.compute.manager [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Took 2.66 seconds to spawn the instance on the hypervisor. 2025-10-06 16:09:26.076 2 DEBUG nova.compute.manager [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1655 2025-10-06 16:09:26.077 2 DEBUG nova.virt.driver [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] Emitting event <LifecycleEvent: …, 773e8581-e83f-432d-8be8-bb426e08df6c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1612 2025-10-06 16:09:26.078 2 INFO nova.compute.manager [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] VM Resumed (Lifecycle Event) 2025-10-06 16:09:26.110 2 DEBUG nova.compute.manager [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1655 2025-10-06 16:09:26.114 2 DEBUG nova.compute.manager [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1289 2025-10-06 16:09:26.155 2 INFO nova.compute.manager [req-b512aef6-a09b-46d2-a738-0c5f43e10003 - - - - -] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] During sync_power_state the instance has a pending task (spawning). Skip. 2025-10-06 16:09:26.184 2 INFO nova.compute.manager [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Took 4.68 seconds to build instance.
2025-10-06 16:09:26.206 2 DEBUG oslo_concurrency.lockutils [req-69845de5-7aee-4c69-8bd3-9fe3819326d8 00a0ae66495a4319b62d3402a43653ff 64e40da08e4a44b0890ef4f0195eaaa4 - default default] Lock "773e8581-e83f-432d-8be8-bb426e08df6c" released by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.788s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:09:29.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:29.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:34.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:34.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:39.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:39.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:44.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:44.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:45.400 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:09:45.400 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10780 2025-10-06 16:09:45.420 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10789 2025-10-06 16:09:49.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:49.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:50.342 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:09:50.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:09:50.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:09:50.400 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Cleaning up deleted instances with
incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:10818 2025-10-06 16:09:51.400 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:09:53.410 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:09:54.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:54.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:55.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:09:55.412 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:09:55.432 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:09:55.433 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:09:55.433 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858 2025-10-06 16:09:55.434 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:09:56.070 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.636s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:09:56.169 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-06 16:09:56.170 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-06 16:09:56.170 2 DEBUG 
nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-06 16:09:56.173 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-06 16:09:56.173 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-06 16:09:56.500 2 WARNING nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 2025-10-06 16:09:56.502 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=3615MB free_disk=6.949642181396484GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033 2025-10-06 16:09:56.502 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:09:56.611 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Instance 4fd14755-66e2-4403-a5fb-0f03557d253f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1548 2025-10-06 16:09:56.611 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Instance 773e8581-e83f-432d-8be8-bb426e08df6c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1548 2025-10-06 16:09:56.612 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056 2025-10-06 16:09:56.612 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=1536MB phys_disk=6GB used_disk=3GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065 2025-10-06 16:09:56.650 2 DEBUG nova.scheduler.client.report [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Refreshing inventories for resource provider 746a1a56-6a22-48dd-85a9-45922719c8f6 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804 2025-10-06 16:09:56.675 2 DEBUG nova.scheduler.client.report [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Updating ProviderTree inventory for provider 746a1a56-6a22-48dd-85a9-45922719c8f6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768 2025-10-06 16:09:56.676 2 DEBUG nova.compute.provider_tree [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Updating inventory in ProviderTree for provider 746a1a56-6a22-48dd-85a9-45922719c8f6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:175 2025-10-06 16:09:56.688 2 DEBUG nova.scheduler.client.report [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Refreshing aggregate associations for resource provider 746a1a56-6a22-48dd-85a9-45922719c8f6, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813 2025-10-06 16:09:56.706 2 DEBUG nova.scheduler.client.report [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Refreshing trait 
associations for resource provider 746a1a56-6a22-48dd-85a9-45922719c8f6, traits: COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_FMA3,HW_CPU_X86_BMI,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,HW_CPU_X86_F16C,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_BMI2,HW_CPU_X86_SSSE3,COMPUTE_NODE,HW_CPU_X86_AVX2,HW_CPU_X86_SSE,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SVM,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_ABM,HW_CPU_X86_SHA,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825 2025-10-06 16:09:56.707 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:09:57.184 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:09:57.192 2 DEBUG nova.compute.provider_tree [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Inventory has not changed in ProviderTree for provider: 746a1a56-6a22-48dd-85a9-45922719c8f6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179 2025-10-06 16:09:57.209 2 DEBUG nova.scheduler.client.report [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Inventory has not changed for provider 746a1a56-6a22-48dd-85a9-45922719c8f6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 2025-10-06 16:09:57.210 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994 2025-10-06 16:09:57.210 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.708s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 
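[Editor's note, not part of the captured log: the update_available_resource cycle above ends with the report client confirming the provider's inventory is unchanged. As a minimal sketch of what that logged inventory dict means for scheduling (assuming nothing beyond the values printed in the log; the formula (total - reserved) * allocation_ratio is placement's standard capacity calculation):]

    # Illustration only; inventory values copied from the log entries above.
    inventory = {
        "VCPU": {"total": 8, "reserved": 0, "allocation_ratio": 16.0},
        "MEMORY_MB": {"total": 15738, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB": {"total": 6, "reserved": 0, "allocation_ratio": 1.0},
    }
    for rc, inv in inventory.items():
        # Placement-style schedulable capacity per resource class.
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(f"{rc}: capacity = {capacity:g}")
    # -> VCPU: capacity = 128, MEMORY_MB: capacity = 15226, DISK_GB: capacity = 6

[This squares with the "Final resource view" entries: the host has 8 physical vCPUs with 2 allocated, yet with cpu allocation_ratio 16.0 placement will accept up to 128 VCPU units of allocations against this provider.]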
2025-10-06 16:09:58.198 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:09:58.224 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:09:58.224 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:09:58.400 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:09:58.401 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-06 16:09:58.401 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518 2025-10-06 16:09:58.509 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Acquired lock "refresh_cache-4fd14755-66e2-4403-a5fb-0f03557d253f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:266 2025-10-06 16:09:58.509 2 DEBUG nova.network.neutron [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:1866 2025-10-06 16:09:58.510 2 DEBUG nova.objects.instance [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lazy-loading 'info_cache' on Instance uuid 4fd14755-66e2-4403-a5fb-0f03557d253f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1098 2025-10-06 16:09:59.083 2 DEBUG nova.network.neutron [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Updating instance_info_cache with network_info: [{"id": "e6ead84d-622e-4356-93d8-2f0ae5455764", "address": "fa:16:3e:ec:db:8e", "network": {"id": "551ecee8-9a26-40b8-b2fb-1be7f7ae2d07", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "64e40da08e4a44b0890ef4f0195eaaa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tape6ead84d-62", "ovs_interfaceid": "e6ead84d-622e-4356-93d8-2f0ae5455764", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": 
{}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116 2025-10-06 16:09:59.100 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Releasing lock "refresh_cache-4fd14755-66e2-4403-a5fb-0f03557d253f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:282 2025-10-06 16:09:59.100 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9585 2025-10-06 16:09:59.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:09:59.398 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:09:59.399 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-06 16:09:59.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:10:04.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:10:04.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:10:07.997 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:10:08.024 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Triggering sync for uuid 4fd14755-66e2-4403-a5fb-0f03557d253f _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:9924 2025-10-06 16:10:08.025 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Triggering sync for uuid 773e8581-e83f-432d-8be8-bb426e08df6c _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:9924 2025-10-06 16:10:08.025 2 DEBUG oslo_concurrency.lockutils [-] Lock "4fd14755-66e2-4403-a5fb-0f03557d253f" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:10:08.026 2 DEBUG oslo_concurrency.lockutils [-] Lock "773e8581-e83f-432d-8be8-bb426e08df6c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:10:08.082 2 DEBUG oslo_concurrency.lockutils [-] Lock "4fd14755-66e2-4403-a5fb-0f03557d253f" released by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.056s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:10:08.090 2 DEBUG oslo_concurrency.lockutils [-] Lock "773e8581-e83f-432d-8be8-bb426e08df6c" released by
"nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.064s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:10:09.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:10:09.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:10:14.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:10:14.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:10:19.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:10:19.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:10:24.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:10:24.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:10:29.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:10:29.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:10:34.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:10:34.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:10:39.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:10:39.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:10:44.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:10:44.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:10:49.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:10:49.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:10:50.429 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:10:51.322 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:10:54.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:10:54.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:10:55.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:10:55.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:10:55.413 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:10:55.433 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:10:55.433 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:10:55.433 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858 2025-10-06 16:10:55.434 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:10:55.905 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:10:55.996 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-06 16:10:55.997 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-06 16:10:55.997 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-06 16:10:56.002 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-06 16:10:56.002 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-06 16:10:56.303 2 WARNING nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 2025-10-06 16:10:56.305 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=3768MB free_disk=6.949642181396484GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033 2025-10-06 16:10:56.305 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:10:56.406 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Instance 4fd14755-66e2-4403-a5fb-0f03557d253f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1548 2025-10-06 16:10:56.406 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Instance 773e8581-e83f-432d-8be8-bb426e08df6c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1548 2025-10-06 16:10:56.407 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056 2025-10-06 16:10:56.407 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=1536MB phys_disk=6GB used_disk=3GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065 2025-10-06 16:10:56.410 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:10:56.847 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:10:56.853 2 DEBUG nova.compute.provider_tree [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Inventory has not changed in ProviderTree for provider: 746a1a56-6a22-48dd-85a9-45922719c8f6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179 2025-10-06 16:10:56.873 2 DEBUG nova.scheduler.client.report [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Inventory has not changed for provider 746a1a56-6a22-48dd-85a9-45922719c8f6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 2025-10-06 16:10:56.873 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994 2025-10-06 16:10:56.873 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.568s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:10:57.859 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:10:57.860 2 DEBUG 
oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:10:59.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:10:59.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:11:00.400 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:11:00.400 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-06 16:11:00.515 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Acquired lock "refresh_cache-773e8581-e83f-432d-8be8-bb426e08df6c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:266 2025-10-06 16:11:00.516 2 DEBUG nova.network.neutron [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:1866 2025-10-06 16:11:00.965 2 DEBUG nova.network.neutron [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Updating instance_info_cache with network_info: [{"id": "1511a682-810a-45db-8cd9-bcc2c61b241b", "address": "fa:16:3e:e7:4b:57", "network": {"id": "551ecee8-9a26-40b8-b2fb-1be7f7ae2d07", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "64e40da08e4a44b0890ef4f0195eaaa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap1511a682-81", "ovs_interfaceid": "1511a682-810a-45db-8cd9-bcc2c61b241b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116 2025-10-06 16:11:00.984 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Releasing lock "refresh_cache-773e8581-e83f-432d-8be8-bb426e08df6c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:282 2025-10-06 16:11:00.985 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9585 2025-10-06 16:11:00.985 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:11:00.986 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-06 16:11:04.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:11:04.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:11:09.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:11:09.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:11:14.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:11:14.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:11:19.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:11:19.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:11:19.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5019 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:11:19.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:11:19.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:11:19.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:11:24.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:11:24.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:11:24.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:11:24.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:11:25.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:11:25.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:11:30.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:11:30.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:11:30.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5055 ms, sending 
inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:11:30.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:11:30.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:11:30.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:11:35.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:11:40.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:11:40.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:11:40.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:11:40.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:11:40.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:11:40.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:11:45.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:11:45.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:11:50.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:11:50.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:11:50.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:11:50.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:11:50.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:11:50.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:11:50.400 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:11:53.320 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:11:55.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:11:55.398 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:11:56.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:11:57.321 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:11:57.398 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:11:57.414 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:11:57.442 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:11:57.443 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:11:57.443 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858 2025-10-06 16:11:57.444 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:11:57.916 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:11:57.989 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-06 16:11:57.989 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-06 16:11:57.990 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for 
instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-06 16:11:57.994 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-06 16:11:57.995 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-06 16:11:58.178 2 WARNING nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 2025-10-06 16:11:58.179 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=3717MB free_disk=6.949642181396484GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033 2025-10-06 16:11:58.180 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:11:58.224 2 DEBUG nova.compute.resource_tracker 
[req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Instance 4fd14755-66e2-4403-a5fb-0f03557d253f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1548 2025-10-06 16:11:58.224 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Instance 773e8581-e83f-432d-8be8-bb426e08df6c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1548 2025-10-06 16:11:58.224 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056 2025-10-06 16:11:58.224 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=1536MB phys_disk=6GB used_disk=3GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065 2025-10-06 16:11:58.226 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:11:58.681 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:11:58.688 2 DEBUG nova.compute.provider_tree [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Inventory has not changed in ProviderTree for provider: 746a1a56-6a22-48dd-85a9-45922719c8f6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179 2025-10-06 16:11:58.705 2 DEBUG nova.scheduler.client.report [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Inventory has not changed for provider 746a1a56-6a22-48dd-85a9-45922719c8f6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 2025-10-06 16:11:58.706 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994 2025-10-06 16:11:58.706 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.526s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:11:59.708 2 DEBUG oslo_service.periodic_task 
[req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:12:00.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:12:00.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:12:00.400 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:12:00.401 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-06 16:12:00.401 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518 2025-10-06 16:12:00.516 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Acquired lock "refresh_cache-4fd14755-66e2-4403-a5fb-0f03557d253f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:266 2025-10-06 16:12:00.517 2 DEBUG nova.network.neutron [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:1866 2025-10-06 16:12:00.517 2 DEBUG nova.objects.instance [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lazy-loading 'info_cache' on Instance uuid 4fd14755-66e2-4403-a5fb-0f03557d253f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1098 2025-10-06 16:12:01.030 2 DEBUG nova.network.neutron [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Updating instance_info_cache with network_info: [{"id": "e6ead84d-622e-4356-93d8-2f0ae5455764", "address": "fa:16:3e:ec:db:8e", "network": {"id": "551ecee8-9a26-40b8-b2fb-1be7f7ae2d07", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "64e40da08e4a44b0890ef4f0195eaaa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tape6ead84d-62", "ovs_interfaceid": "e6ead84d-622e-4356-93d8-2f0ae5455764", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116 2025-10-06 16:12:01.050 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Releasing lock "refresh_cache-4fd14755-66e2-4403-a5fb-0f03557d253f" 
lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:282
2025-10-06 16:12:01.051 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9585
2025-10-06 16:12:01.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:12:01.399 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131
2025-10-06 16:12:05.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:12:05.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:12:10.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:12:10.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:12:15.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:12:15.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:12:20.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:12:20.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:12:20.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:12:20.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:12:20.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:12:20.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:12:25.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:12:25.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:12:25.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:12:25.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:12:25.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:12:25.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:12:30.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:12:30.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:12:30.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5050 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:12:30.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:12:30.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:12:30.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:12:35.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:12:35.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:12:40.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:12:40.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:12:40.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:12:40.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:12:40.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:12:40.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:12:45.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:12:45.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:12:45.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:12:45.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:12:45.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:12:45.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:12:50.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:12:50.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:12:51.402 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:12:54.321 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:12:55.398 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:12:55.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:12:55.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:12:55.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5041 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:12:55.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:12:55.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:12:55.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:12:57.398 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:12:58.398 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:12:58.413 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:12:58.431 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355
2025-10-06 16:12:58.431 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367
2025-10-06 16:12:58.432 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858
2025-10-06 16:12:58.432 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
2025-10-06 16:12:58.955 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.523s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
2025-10-06 16:12:59.019 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768
2025-10-06 16:12:59.020 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768
2025-10-06 16:12:59.020 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768
2025-10-06 16:12:59.023 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768
2025-10-06 16:12:59.024 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768
2025-10-06 16:12:59.201 2 WARNING nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
2025-10-06 16:12:59.202 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=4229MB free_disk=6.949642181396484GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033
2025-10-06 16:12:59.202 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355
2025-10-06 16:12:59.262 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Instance 4fd14755-66e2-4403-a5fb-0f03557d253f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1548
2025-10-06 16:12:59.263 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Instance 773e8581-e83f-432d-8be8-bb426e08df6c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1548
2025-10-06 16:12:59.263 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056
2025-10-06 16:12:59.263 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=1536MB phys_disk=6GB used_disk=3GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065
2025-10-06 16:12:59.265 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
2025-10-06 16:12:59.718 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
2025-10-06 16:12:59.724 2 DEBUG nova.compute.provider_tree [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Inventory has not changed in ProviderTree for provider: 746a1a56-6a22-48dd-85a9-45922719c8f6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179
2025-10-06 16:12:59.736 2 DEBUG nova.scheduler.client.report [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Inventory has not changed for provider 746a1a56-6a22-48dd-85a9-45922719c8f6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
2025-10-06 16:12:59.737 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994
2025-10-06 16:12:59.737 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.535s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367
2025-10-06 16:13:00.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:13:00.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:13:00.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:13:00.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:13:00.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:13:00.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:13:00.738 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:13:00.738 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514
2025-10-06 16:13:00.876 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Acquired lock "refresh_cache-773e8581-e83f-432d-8be8-bb426e08df6c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:266
2025-10-06 16:13:00.876 2 DEBUG nova.network.neutron [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:1866
2025-10-06 16:13:01.435 2 DEBUG nova.network.neutron [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Updating instance_info_cache with network_info: [{"id": "1511a682-810a-45db-8cd9-bcc2c61b241b", "address": "fa:16:3e:e7:4b:57", "network": {"id": "551ecee8-9a26-40b8-b2fb-1be7f7ae2d07", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "64e40da08e4a44b0890ef4f0195eaaa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap1511a682-81", "ovs_interfaceid": "1511a682-810a-45db-8cd9-bcc2c61b241b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
2025-10-06 16:13:01.457 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Releasing lock "refresh_cache-773e8581-e83f-432d-8be8-bb426e08df6c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:282
2025-10-06 16:13:01.457 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9585
2025-10-06 16:13:01.457 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:13:02.398 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:13:02.399 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131
2025-10-06 16:13:05.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:13:10.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:13:10.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:13:10.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5006 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:13:10.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:13:10.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:13:10.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:13:15.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:13:15.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:13:15.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:13:15.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:13:15.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:13:15.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:13:20.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:13:20.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:13:25.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:13:25.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:13:25.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5039 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:13:25.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:13:25.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:13:25.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:13:30.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:13:30.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:13:30.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5041 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:13:30.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:13:30.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:13:30.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:13:35.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:13:40.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:13:40.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:13:40.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:13:40.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:13:40.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:13:40.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:13:45.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:13:50.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:13:50.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:13:50.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:13:50.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:13:50.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:13:50.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:13:51.400 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:13:55.320 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:13:55.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:13:55.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:13:55.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:13:55.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:13:55.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:13:55.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:13:55.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:13:57.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:13:59.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:13:59.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:13:59.414 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:13:59.434 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355
2025-10-06 16:13:59.435 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367
2025-10-06 16:13:59.435 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858
2025-10-06 16:13:59.436 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
2025-10-06 16:14:00.023 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.587s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
2025-10-06 16:14:00.106 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768
2025-10-06 16:14:00.107 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768
2025-10-06 16:14:00.107 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768
2025-10-06 16:14:00.112 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768
2025-10-06 16:14:00.112 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768
2025-10-06 16:14:00.383 2 WARNING nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
2025-10-06 16:14:00.385 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=5533MB free_disk=6.949642181396484GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033
2025-10-06 16:14:00.385 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355
2025-10-06 16:14:00.476 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Instance 4fd14755-66e2-4403-a5fb-0f03557d253f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1548
2025-10-06 16:14:00.477 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Instance 773e8581-e83f-432d-8be8-bb426e08df6c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1548
2025-10-06 16:14:00.478 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056
2025-10-06 16:14:00.478 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=1536MB phys_disk=6GB used_disk=3GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065
2025-10-06 16:14:00.482 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
2025-10-06 16:14:00.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:14:00.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:14:00.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5044 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:14:00.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:14:00.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:14:00.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:14:00.963 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
2025-10-06 16:14:00.970 2 DEBUG nova.compute.provider_tree [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Inventory has not changed in ProviderTree for provider: 746a1a56-6a22-48dd-85a9-45922719c8f6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:179
2025-10-06 16:14:00.986 2 DEBUG nova.scheduler.client.report [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Inventory has not changed for provider 746a1a56-6a22-48dd-85a9-45922719c8f6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 6, 'reserved': 0, 'min_unit': 1, 'max_unit': 6, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
2025-10-06 16:14:00.987 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Compute_service record updated for standalone.localdomain:standalone.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:994
2025-10-06 16:14:00.988 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.602s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367
2025-10-06 16:14:02.911 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:14:02.929 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:14:02.929 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514
2025-10-06 16:14:02.930 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518
2025-10-06 16:14:03.136 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Acquired lock "refresh_cache-4fd14755-66e2-4403-a5fb-0f03557d253f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:266
2025-10-06 16:14:03.137 2 DEBUG nova.network.neutron [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:1866
2025-10-06 16:14:03.137 2 DEBUG nova.objects.instance [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lazy-loading 'info_cache' on Instance uuid 4fd14755-66e2-4403-a5fb-0f03557d253f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1098
2025-10-06 16:14:03.655 2 DEBUG nova.network.neutron [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Updating instance_info_cache with network_info: [{"id": "e6ead84d-622e-4356-93d8-2f0ae5455764", "address": "fa:16:3e:ec:db:8e", "network": {"id": "551ecee8-9a26-40b8-b2fb-1be7f7ae2d07", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "64e40da08e4a44b0890ef4f0195eaaa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tape6ead84d-62", "ovs_interfaceid": "e6ead84d-622e-4356-93d8-2f0ae5455764", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
2025-10-06 16:14:03.671 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Releasing lock "refresh_cache-4fd14755-66e2-4403-a5fb-0f03557d253f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:282
2025-10-06 16:14:03.672 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] [instance: 4fd14755-66e2-4403-a5fb-0f03557d253f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9585
2025-10-06 16:14:04.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:14:04.400 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131
2025-10-06 16:14:05.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:14:05.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:14:05.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:14:05.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:14:05.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:14:05.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:14:10.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:14:10.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:14:10.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5034 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:14:10.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:14:10.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:14:10.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:14:15.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:14:15.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:14:15.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:14:15.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:14:15.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:14:15.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:14:21.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:14:21.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:14:21.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5040 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:14:21.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:14:21.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:14:21.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:14:26.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:14:26.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:14:26.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:14:26.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:14:26.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:14:26.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:14:31.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:14:31.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:14:31.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:14:31.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:14:31.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:14:31.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:14:36.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:14:41.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:14:41.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:14:41.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:14:41.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:14:41.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:14:41.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:14:46.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:14:46.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:14:46.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:14:46.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:14:46.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:14:46.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:14:51.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:14:51.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:14:51.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:14:51.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:14:51.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:14:51.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:14:51.400 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:14:55.400 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:14:56.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:14:56.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:14:56.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:14:56.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:14:56.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:14:56.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:14:56.321 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:14:57.398 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:14:58.400 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:14:58.400 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10780
2025-10-06 16:14:58.416 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10789
2025-10-06 16:14:58.416 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:14:58.416 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:10818
2025-10-06 16:14:59.428 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:15:00.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:15:00.415 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:15:00.439 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355
2025-10-06 16:15:00.439 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367
2025-10-06 16:15:00.440 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858
2025-10-06 16:15:00.440 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
2025-10-06 16:15:00.926 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
2025-10-06 16:15:01.011 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768
2025-10-06 16:15:01.012 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768
2025-10-06 16:15:01.012 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768
2025-10-06 16:15:01.020 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768
2025-10-06 16:15:01.021 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768
2025-10-06 16:15:01.299 2 WARNING nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
2025-10-06 16:15:01.301 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=6939MB free_disk=6.949642181396484GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033 2025-10-06 16:15:01.301 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:15:01.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:15:01.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:15:01.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5032 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:15:01.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:15:01.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:15:01.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 
16:15:01.395 2 ERROR nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Skipping removal of allocations for deleted instances: Failed to retrieve allocations for resource provider 746a1a56-6a22-48dd-85a9-45922719c8f6: {"message": "The server is currently unavailable. Please try again at a later time.\nThe Keystone service is temporarily unavailable.\n\n", "code": "503 Service Unavailable", "title": "Service Unavailable"}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 746a1a56-6a22-48dd-85a9-45922719c8f6: {"message": "The server is currently unavailable. Please try again at a later time.
\nThe Keystone service is temporarily unavailable.\n\n", "code": "503 Service Unavailable", "title": "Service Unavailable"} 2025-10-06 16:15:01.396 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056 2025-10-06 16:15:01.396 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=1536MB phys_disk=6GB used_disk=3GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065 2025-10-06 16:15:01.410 2 ERROR nova.scheduler.client.report [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] [req-e76cec2b-1b57-48eb-b816-218c0b7e9c5b] Failed to retrieve resource provider tree from placement API for UUID 746a1a56-6a22-48dd-85a9-45922719c8f6. Got 503: {"message": "The server is currently unavailable. Please try again at a later time.
\nThe Keystone service is temporarily unavailable.\n\n", "code": "503 Service Unavailable", "title": "Service Unavailable"}. 2025-10-06 16:15:01.411 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Error updating resources for node standalone.localdomain.: nova.exception.ResourceProviderRetrievalFailed: Failed to get resource provider with UUID 746a1a56-6a22-48dd-85a9-45922719c8f6 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager Traceback (most recent call last): 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10167, in _update_available_resource_for_node 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager self.rt.update_available_resource(context, nodename, 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 886, in update_available_resource 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager self._update_available_resource(context, resources, startup=startup) 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 360, in inner 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager return f(*args, **kwargs) 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 993, in _update_available_resource 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager self._update(context, cn, startup=startup) 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 1244, in _update 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager self._update_to_placement(context, compute_node, startup) 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/retrying.py", line 49, in wrapped_f 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager return Retrying(*dargs, **dkw).call(f, *args, **kw) 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/retrying.py", line 206, in call 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager return attempt.get(self._wrap_exception) 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/retrying.py", line 247, in get 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager six.reraise(self.value[0], self.value[1], self.value[2]) 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/six.py", line 709, in reraise 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager raise value 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/retrying.py", line 200, in call 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager attempt = Attempt(fn(*args, **kwargs), attempt_number, False) 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 1174, in _update_to_placement 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager prov_tree = self.reportclient.get_provider_tree_and_ensure_root( 2025-10-06 16:15:01.412 2 ERROR 
nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/scheduler/client/report.py", line 899, in get_provider_tree_and_ensure_root 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager self._ensure_resource_provider( 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/scheduler/client/report.py", line 688, in _ensure_resource_provider 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager rps_to_refresh = self.get_providers_in_tree(context, uuid) 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/scheduler/client/report.py", line 551, in get_providers_in_tree 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager raise exception.ResourceProviderRetrievalFailed(uuid=uuid) 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager nova.exception.ResourceProviderRetrievalFailed: Failed to get resource provider with UUID 746a1a56-6a22-48dd-85a9-45922719c8f6 2025-10-06 16:15:01.412 2 ERROR nova.compute.manager 2025-10-06 16:15:01.423 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:15:04.433 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:15:04.433 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-06 16:15:04.566 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Acquired lock "refresh_cache-773e8581-e83f-432d-8be8-bb426e08df6c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:266 2025-10-06 16:15:04.567 2 DEBUG nova.network.neutron [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:1866 2025-10-06 16:15:05.078 2 DEBUG nova.network.neutron [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Updating instance_info_cache with network_info: [{"id": "1511a682-810a-45db-8cd9-bcc2c61b241b", "address": "fa:16:3e:e7:4b:57", "network": {"id": "551ecee8-9a26-40b8-b2fb-1be7f7ae2d07", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "64e40da08e4a44b0890ef4f0195eaaa4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap1511a682-81", "ovs_interfaceid": "1511a682-810a-45db-8cd9-bcc2c61b241b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:116 2025-10-06 16:15:05.095 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Releasing lock "refresh_cache-773e8581-e83f-432d-8be8-bb426e08df6c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:282 2025-10-06 16:15:05.096 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] [instance: 773e8581-e83f-432d-8be8-bb426e08df6c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9585 2025-10-06 16:15:05.400 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:15:05.400 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-06 16:15:06.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:15:06.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:15:06.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:15:06.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:15:06.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:15:06.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:15:11.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:15:11.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:15:11.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:15:11.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:15:11.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:15:11.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:15:16.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:15:16.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:15:16.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:15:16.426 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:15:16.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:15:16.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:15:21.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:15:21.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:15:21.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:15:21.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:15:21.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:15:21.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:15:26.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:15:26.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:15:26.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5036 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:15:26.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:15:26.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:15:26.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:15:31.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:15:31.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:15:31.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:15:31.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:15:31.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:15:31.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:15:36.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:15:36.575 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:15:36.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:15:36.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:15:36.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:15:36.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:15:41.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:15:41.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:15:41.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:15:41.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:15:41.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:15:41.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:15:46.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:15:46.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:15:46.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:15:46.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:15:46.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:15:46.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:15:51.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:15:51.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:15:51.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:15:51.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:15:51.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:15:51.744 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:15:53.400 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:15:55.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:15:56.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:15:56.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:15:56.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:15:56.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:15:56.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:15:56.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:15:57.320 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:15:57.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:15:59.400 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:16:01.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:16:01.419 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:16:01.420 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:16:01.420 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Auditing locally available compute resources for standalone.localdomain (node: standalone.localdomain) update_available_resource 
/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:858 2025-10-06 16:16:01.420 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384 2025-10-06 16:16:01.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:16:01.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:16:01.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:16:01.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:16:01.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:16:01.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:16:01.928 2 DEBUG oslo_concurrency.processutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 2025-10-06 16:16:02.025 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-06 16:16:02.025 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-06 16:16:02.026 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-06 16:16:02.031 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-06 16:16:02.031 2 DEBUG nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10768 2025-10-06 16:16:02.291 2 WARNING nova.virt.libvirt.driver [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
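This second audit cycle is about to fail the same way as the first: the GET for resource provider 746a1a56-6a22-48dd-85a9-45922719c8f6 comes back 503 again below. A hedged diagnostic sketch, assuming openstacksdk is installed and a clouds.yaml entry named "standalone" exists, that issues the same GET the report client makes so the raw error body can be read outside Nova:

    import openstack

    RP_UUID = "746a1a56-6a22-48dd-85a9-45922719c8f6"  # from the log above

    conn = openstack.connect(cloud="standalone")  # assumed clouds.yaml entry
    # The placement proxy is a keystoneauth Adapter, so a raw GET works;
    # raise_exc=False returns the 503 response instead of raising on it.
    resp = conn.placement.get("/resource_providers/" + RP_UUID, raise_exc=False)
    print(resp.status_code)
    print(resp.text)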
2025-10-06 16:16:02.292 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Hypervisor/Node resource view: name=standalone.localdomain free_ram=8985MB free_disk=6.949642181396484GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1033 2025-10-06 16:16:02.292 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:355 2025-10-06 16:16:05.356 2 ERROR nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Skipping removal of allocations for deleted instances: Failed to retrieve allocations for resource provider 746a1a56-6a22-48dd-85a9-45922719c8f6:
503 Service Unavailable No server is available to handle this request. : nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 746a1a56-6a22-48dd-85a9-45922719c8f6: 503 Service Unavailable
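Note the change in the 503 body between the two audit cycles: the earlier failure carried keystonemiddleware's JSON ("The Keystone service is temporarily unavailable"), while this one is haproxy's bare "No server is available to handle this request", meaning the load balancer itself has no healthy backend for the endpoint. A small sketch, with a placeholder endpoint URL, for telling the two apart from the compute host:

    import requests

    # Placeholder URL; substitute the placement endpoint from the service catalog.
    ENDPOINT = "http://192.168.24.2:8778/"

    r = requests.get(ENDPOINT, timeout=5)
    if r.status_code == 503:
        if "Keystone service is temporarily unavailable" in r.text:
            print("keystonemiddleware answered: placement is up, keystone is not")
        elif "No server is available" in r.text:
            print("haproxy answered: no healthy backend behind the VIP at all")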
2025-10-06 16:16:05.357 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1056 2025-10-06 16:16:05.358 2 DEBUG nova.compute.resource_tracker [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Final resource view: name=standalone.localdomain phys_ram=15738MB used_ram=1536MB phys_disk=6GB used_disk=3GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1065 2025-10-06 16:16:06.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:16:07.369 2 ERROR nova.scheduler.client.report [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] [None] Failed to retrieve resource provider tree from placement API for UUID 746a1a56-6a22-48dd-85a9-45922719c8f6. Got 503:
503 Service Unavailable
No server is available to handle this request. . 2025-10-06 16:16:07.369 2 DEBUG oslo_concurrency.lockutils [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 5.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:367 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Error updating resources for node standalone.localdomain.: nova.exception.ResourceProviderRetrievalFailed: Failed to get resource provider with UUID 746a1a56-6a22-48dd-85a9-45922719c8f6 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager Traceback (most recent call last): 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10167, in _update_available_resource_for_node 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager self.rt.update_available_resource(context, nodename, 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 886, in update_available_resource 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager self._update_available_resource(context, resources, startup=startup) 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 360, in inner 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager return f(*args, **kwargs) 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 993, in _update_available_resource 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager self._update(context, cn, startup=startup) 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 1244, in _update 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager self._update_to_placement(context, compute_node, startup) 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/retrying.py", line 49, in wrapped_f 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager return Retrying(*dargs, **dkw).call(f, *args, **kw) 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/retrying.py", line 206, in call 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager return attempt.get(self._wrap_exception) 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/retrying.py", line 247, in get 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager six.reraise(self.value[0], self.value[1], self.value[2]) 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/six.py", line 709, in reraise 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager raise value 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/retrying.py", line 200, in call 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager attempt = Attempt(fn(*args, **kwargs), attempt_number, False) 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 1174, in _update_to_placement 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager prov_tree = self.reportclient.get_provider_tree_and_ensure_root( 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager File 
"/usr/lib/python3.9/site-packages/nova/scheduler/client/report.py", line 899, in get_provider_tree_and_ensure_root 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager self._ensure_resource_provider( 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/scheduler/client/report.py", line 688, in _ensure_resource_provider 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager rps_to_refresh = self.get_providers_in_tree(context, uuid) 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/scheduler/client/report.py", line 551, in get_providers_in_tree 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager raise exception.ResourceProviderRetrievalFailed(uuid=uuid) 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager nova.exception.ResourceProviderRetrievalFailed: Failed to get resource provider with UUID 746a1a56-6a22-48dd-85a9-45922719c8f6 2025-10-06 16:16:07.370 2 ERROR nova.compute.manager 2025-10-06 16:16:08.372 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:16:11.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:16:11.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:16:11.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:16:11.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:16:11.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:16:11.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:16:16.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:16:16.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:16:16.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:16:16.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:16:16.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:16:16.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:16:21.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:16:21.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:16:21.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:16:21.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:16:21.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:16:21.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:16:26.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:16:26.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:16:31.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:16:31.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:16:36.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:16:36.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:16:41.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:16:46.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:16:52.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:16:57.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:17:02.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:17:02.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:17:04.138 2 WARNING nova.servicegroup.drivers.db [-] Lost connection to nova-conductor for reporting service status.: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 9b55e57d1d914e669b3519ee6c71ad2c 2025-10-06 16:17:04.143 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.01 sec 2025-10-06 16:17:07.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:17:07.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Error during ComputeManager._sync_scheduler_instance_info: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 1e6c158c78b247818fb6c8a959484149 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-06 16:17:08.376 2 ERROR 
oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task return self.greenlet.switch() 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task _queue.Empty 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred: 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task task(self, context) 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2137, in _sync_scheduler_instance_info 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task instances = objects.InstanceList.get_by_host(context, self.host, 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions( 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions', 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg, 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message, 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout, 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout, 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout) 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout( 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 1e6c158c78b247818fb6c8a959484149 2025-10-06 16:17:08.376 2 ERROR oslo_service.periodic_task 2025-10-06 16:17:08.382 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:17:08.382 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514 2025-10-06 16:17:08.382 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518 2025-10-06 16:17:12.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:17:12.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:17:17.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:17:22.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:17:27.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:17:32.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:17:37.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:17:42.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:17:47.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:17:52.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:17:57.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:18:02.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:18:04.147 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec 2025-10-06 16:18:07.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:18:07.076 2 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:18:07.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5040 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:18:07.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:18:07.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:18:07.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Error during ComputeManager._heal_instance_info_cache: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 815818fee937474d9bee4affc988d164 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task return self.greenlet.switch() 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task _queue.Empty 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred: 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task task(self, context) 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9519, in _heal_instance_info_cache 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task db_instances = objects.InstanceList.get_by_host( 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions( 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task return 
cctxt.call(context, 'object_class_action_versions', 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg, 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message, 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout, 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout, 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout) 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout( 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 815818fee937474d9bee4affc988d164 2025-10-06 16:18:08.384 2 ERROR oslo_service.periodic_task 2025-10-06 16:18:08.386 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:18:08.387 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:18:08.387 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:18:08.387 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:18:12.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:18:12.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:18:12.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:18:12.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 
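Both periodic-task failures above are the same symptom: a blocking RPC call to nova-conductor that gets no reply within the rpc_response_timeout window, which also explains the "_report_state ... run outlasted interval by 50 sec" warnings. An illustrative sketch of that call pattern (not Nova's own code; it assumes transport_url is configured to point at the same RabbitMQ the compute service uses), showing where oslo.messaging raises MessagingTimeout:

    import oslo_messaging as messaging
    from oslo_config import cfg

    # Reads [DEFAULT] transport_url from the registered configuration.
    transport = messaging.get_rpc_transport(cfg.CONF)
    target = messaging.Target(topic="conductor")
    client = messaging.RPCClient(transport, target, timeout=60)
    try:
        # Simplified stand-in for the object_class_action_versions call in the
        # traceback; with no conductor consuming the queue, the reply never comes.
        client.call({}, "ping")
    except messaging.MessagingTimeout:
        print("no reply from nova-conductor: check rabbitmq and the conductor log")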
2025-10-06 16:18:12.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:18:12.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:18:17.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:18:17.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:18:17.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:18:17.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:18:17.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:18:17.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:18:22.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:18:22.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:18:22.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:18:22.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:18:22.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:18:22.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:18:27.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:18:27.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:18:27.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:18:27.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:18:27.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:18:27.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:18:32.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:18:32.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:18:32.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:18:32.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:18:32.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:18:32.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:18:37.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:18:37.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:18:37.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:18:37.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:18:37.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:18:37.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:18:42.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:18:42.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:18:42.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5035 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:18:42.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:18:42.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:18:42.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:18:47.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:18:47.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:18:47.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:18:47.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:18:47.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:18:47.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:18:52.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:18:52.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:18:52.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:18:52.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:18:52.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:18:52.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:18:57.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:19:02.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:19:04.167 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.02 sec
2025-10-06 16:19:07.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:19:07.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Error during ComputeManager._instance_usage_audit: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 9abda755ca9a4b7aa945a25ee6e230c4
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9768, in _instance_usage_audit
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task if objects.TaskLog.get(context, 'instance_usage_audit', begin, end,
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/objects/base.py", line 355, in wrapper
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task return fn.__get__(None, obj)(*args, **kwargs)
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 9abda755ca9a4b7aa945a25ee6e230c4
2025-10-06 16:19:08.392 2 ERROR oslo_service.periodic_task
2025-10-06 16:19:08.394 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:19:08.394 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:19:08.394 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131
2025-10-06 16:19:08.395 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:19:12.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:19:12.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:19:12.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5046 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:19:12.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:19:12.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:19:12.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:19:17.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:19:17.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:19:17.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:19:17.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:19:17.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:19:17.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:19:22.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:19:22.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:19:22.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:19:22.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:19:22.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:19:22.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:19:27.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:19:27.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:19:27.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:19:27.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:19:27.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:19:27.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:19:32.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:19:32.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:19:37.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:19:37.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:19:37.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5045 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:19:37.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:19:37.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:19:37.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:19:42.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:19:42.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:19:47.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:19:47.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:19:52.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:19:52.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:19:52.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5028 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:19:52.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:19:52.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:19:52.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:19:57.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:19:57.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:19:57.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:19:57.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:19:57.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:19:57.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:20:02.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:20:02.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:20:02.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:20:02.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:20:02.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:20:02.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:20:04.170 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec
2025-10-06 16:20:07.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Error during ComputeManager.update_available_resource: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 02292e74ba784a8d8177abee98847abd
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10225, in update_available_resource
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task compute_nodes_in_db = self._get_compute_nodes_in_db(context,
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10272, in _get_compute_nodes_in_db
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task return objects.ComputeNodeList.get_all_by_host(context, self.host,
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 02292e74ba784a8d8177abee98847abd
2025-10-06 16:20:08.399 2 ERROR oslo_service.periodic_task
2025-10-06 16:20:08.401 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:20:08.401 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10780
2025-10-06 16:20:12.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:20:12.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:20:12.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5007 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:20:12.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:20:12.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:20:12.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:20:17.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:20:17.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:20:17.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:20:17.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:20:17.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:20:17.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:20:22.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:20:22.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:20:22.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:20:22.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:20:22.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:20:22.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:20:27.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:20:27.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:20:27.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:20:27.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:20:27.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:20:27.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:20:32.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:20:32.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:20:32.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:20:32.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:20:32.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:20:32.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:20:37.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:20:42.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:20:42.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:20:42.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5030 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:20:42.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:20:42.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:20:42.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:20:47.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:20:47.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:20:47.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:20:47.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:20:48.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:20:48.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:20:52.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:20:53.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:20:58.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:20:58.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:20:58.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:20:58.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:20:58.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:20:58.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:21:03.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:21:03.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:21:03.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:21:03.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:21:03.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:21:03.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:21:04.174 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec
2025-10-06 16:21:08.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:21:08.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:21:08.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:21:08.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:21:08.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:21:08.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Error during ComputeManager._run_pending_deletes: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID f61c8bae41264bf1a46f69a6f03dd665
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10787, in _run_pending_deletes
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task instances = objects.InstanceList.get_by_filters(
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID f61c8bae41264bf1a46f69a6f03dd665
2025-10-06 16:21:08.405 2 ERROR oslo_service.periodic_task
2025-10-06 16:21:08.406 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:21:08.407 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:10818
2025-10-06 16:21:12.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:21:13.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:21:17.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:21:18.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:21:22.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:21:23.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:21:27.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:21:28.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:21:32.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:21:33.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:21:38.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:21:38.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:21:38.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:21:38.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:21:38.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:21:38.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:21:42.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:21:43.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:21:47.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:21:48.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:21:53.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:21:53.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:21:53.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5031 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:21:53.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:21:53.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:21:53.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:21:53.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:21:58.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:21:58.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:21:58.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5042 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:21:58.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:21:58.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:21:58.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:22:02.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:22:03.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:22:04.178 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec
2025-10-06 16:22:07.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Error during ComputeManager._cleanup_incomplete_migrations: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID bbf76b73e98f4c6bba4674252b930e24
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10821, in _cleanup_incomplete_migrations
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task migrations = objects.MigrationList.get_by_filters(context,
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID bbf76b73e98f4c6bba4674252b930e24
2025-10-06 16:22:08.410 2 ERROR oslo_service.periodic_task
2025-10-06 16:22:08.411 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:22:08.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:22:12.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:22:13.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:22:17.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:22:18.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:22:22.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:22:23.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:22:28.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:22:28.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:22:28.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:22:28.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:22:28.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:22:28.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:22:32.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:22:33.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:22:37.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:22:38.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:22:43.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:22:43.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:22:43.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:22:43.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:22:43.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:22:43.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:22:48.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:22:48.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:22:48.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:22:48.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:22:48.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:22:48.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:22:53.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:22:53.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:22:53.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:22:53.843 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:22:53.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:22:53.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:22:57.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:22:58.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:23:02.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:23:03.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:23:04.182 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec
2025-10-06 16:23:07.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Error during ComputeManager._cleanup_expired_console_auth_tokens: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 6891cecb15894203b1940fa57d1925a4
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10917, in _cleanup_expired_console_auth_tokens
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task objects.ConsoleAuthToken.clean_expired_console_auths(context)
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 6891cecb15894203b1940fa57d1925a4
2025-10-06 16:23:08.415 2 ERROR oslo_service.periodic_task
2025-10-06 16:23:09.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:23:12.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:23:14.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:23:17.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:23:19.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:23:22.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:23:24.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:23:27.878 2 DEBUG
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:23:29.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:23:34.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:23:34.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:23:34.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:23:34.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:23:34.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:23:34.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:23:37.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:23:39.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:23:44.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:23:44.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:23:44.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5032 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:23:44.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:23:44.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:23:44.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:23:47.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:23:49.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:23:53.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:23:54.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:23:58.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:23:59.366 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:23:59.367 2 DEBUG 
oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:23:59.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:24:03.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:24:04.187 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec 2025-10-06 16:24:04.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:24:08.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:24:09.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:24:13.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:24:14.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:24:18.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:24:19.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:24:23.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:24:24.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:24:29.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:24:29.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:24:29.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:24:29.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:24:29.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:24:29.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:24:34.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:24:34.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:24:34.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:24:34.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:24:34.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:24:34.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:24:39.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:24:39.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:24:39.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:24:39.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:24:39.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:24:39.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:24:44.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:24:44.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:24:44.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:24:44.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:24:44.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:24:44.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:24:49.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:24:49.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:24:49.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5031 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:24:49.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:24:49.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:24:49.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:24:54.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:24:54.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:24:54.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:24:54.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:24:54.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:24:54.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Error during ComputeManager._sync_scheduler_instance_info: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 3b1c5be6cc804463ab5915838becac39
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2137, in _sync_scheduler_instance_info
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task instances = objects.InstanceList.get_by_host(context, self.host,
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 3b1c5be6cc804463ab5915838becac39
2025-10-06 16:24:59.374 2 ERROR oslo_service.periodic_task
2025-10-06 16:24:59.375 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:24:59.376 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514
2025-10-06 16:24:59.376 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518
2025-10-06 16:24:59.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:24:59.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:24:59.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:24:59.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:24:59.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:24:59.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:25:04.190
2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec 2025-10-06 16:25:04.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:25:08.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:25:09.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:25:13.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:25:14.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:25:18.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:25:20.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:25:23.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:25:25.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:25:28.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:25:30.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:25:33.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:25:35.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:25:38.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:25:40.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:25:43.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:25:45.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:25:50.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:25:50.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:25:50.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:25:50.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:25:50.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:25:50.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:25:55.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Error during ComputeManager._heal_instance_info_cache: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID f22856fe1d8242aab8649ccae9f09258
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9519, in _heal_instance_info_cache
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task db_instances = objects.InstanceList.get_by_host(
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID f22856fe1d8242aab8649ccae9f09258
2025-10-06 16:25:59.383 2 ERROR oslo_service.periodic_task
2025-10-06 16:25:59.384 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:25:59.385 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:25:59.385 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:25:59.385 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:26:00.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:26:00.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:26:00.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:26:00.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:26:00.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:26:00.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:26:03.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:26:04.194 2 WARNING oslo.service.loopingcall [-] Function
'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec 2025-10-06 16:26:05.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:26:08.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:26:10.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:26:13.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:26:15.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:26:18.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:26:20.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:26:23.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:26:25.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:26:30.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:26:33.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:26:35.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:26:38.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:26:40.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:26:43.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:26:45.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:26:48.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:26:50.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:26:53.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:26:55.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:26:58.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Error during ComputeManager._instance_usage_audit: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 09894aa9d3b544b69b2280f432f0957d 2025-10-06 16:26:59.388 2 
ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9768, in _instance_usage_audit
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task if objects.TaskLog.get(context, 'instance_usage_audit', begin, end,
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/objects/base.py", line 355, in wrapper
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task return fn.__get__(None, obj)(*args, **kwargs)
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 09894aa9d3b544b69b2280f432f0957d
2025-10-06 16:26:59.388 2 ERROR oslo_service.periodic_task
2025-10-06 16:26:59.390 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:26:59.391 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:27:00.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:27:04.197 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec
2025-10-06 16:27:05.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:27:05.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:27:05.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:27:05.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:27:05.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:27:05.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:27:10.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:27:13.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:27:15.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:27:18.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:27:20.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:27:23.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:27:25.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:27:28.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:27:30.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:27:33.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:27:35.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:27:38.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:27:40.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:27:43.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:27:45.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:27:48.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:27:50.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:27:55.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Error during ComputeManager._sync_power_states: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 6c07e04fd9774242b77909e79884137b
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9878, in _sync_power_states
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task db_instances = objects.InstanceList.get_by_host(context, self.host,
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 6c07e04fd9774242b77909e79884137b
2025-10-06 16:27:59.397 2 ERROR oslo_service.periodic_task
2025-10-06 16:27:59.399 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:27:59.400 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CONF.reclaim_instance_interval <= 0, skipping...
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-06 16:27:59.400 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:28:00.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:28:04.201 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec 2025-10-06 16:28:05.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:28:10.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:28:10.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:28:10.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:28:10.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:28:10.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:28:10.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:28:13.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:28:15.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:28:20.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:28:23.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:28:25.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:28:30.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:28:35.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:28:40.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:28:45.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:28:50.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:28:55.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:28:55.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:28:55.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5032 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:28:55.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:28:55.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:28:55.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Error during ComputeManager.update_available_resource: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 6dbdc6ebbb9341d4b1581075519fdf15 2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch 2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task return self.greenlet.switch() 2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task _queue.Empty 2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task 2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred: 2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task 2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks 2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task task(self, context) 2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10225, in update_available_resource 2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task compute_nodes_in_db = self._get_compute_nodes_in_db(context, 2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10272, in _get_compute_nodes_in_db 2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task return objects.ComputeNodeList.get_all_by_host(context, self.host, 2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper 2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions( 
2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 6dbdc6ebbb9341d4b1581075519fdf15
2025-10-06 16:28:59.409 2 ERROR oslo_service.periodic_task
2025-10-06 16:28:59.411 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:28:59.411 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10780
2025-10-06 16:29:00.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:29:03.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:29:04.204 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec
2025-10-06 16:29:05.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:29:08.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:29:10.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:29:13.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:29:15.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:29:20.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:29:20.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:29:20.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:29:20.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:29:20.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:29:20.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:29:23.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:29:25.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:29:28.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:29:30.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:29:33.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:29:35.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:29:38.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:29:40.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:29:45.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:29:45.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:29:45.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:29:45.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:29:45.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:29:45.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:29:48.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:29:50.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:29:53.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:29:55.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:29:58.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Error during ComputeManager._run_pending_deletes: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID bf0b84641a564aa1b640523c748496a7
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10787, in _run_pending_deletes
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task instances = objects.InstanceList.get_by_filters(
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID bf0b84641a564aa1b640523c748496a7
2025-10-06 16:29:59.415 2 ERROR oslo_service.periodic_task
2025-10-06 16:29:59.417 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:29:59.418 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:10818
2025-10-06 16:30:00.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:30:03.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:30:04.209 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec
2025-10-06 16:30:06.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:30:08.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:30:11.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:30:16.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:30:16.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:30:16.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:30:16.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:30:16.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:30:16.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:30:18.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:30:21.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:30:23.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:30:26.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:30:28.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:30:31.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:30:33.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:30:36.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:30:41.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:30:41.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:30:41.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:30:41.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:30:41.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:30:41.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:30:43.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:30:46.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:30:48.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:30:51.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:30:54.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:30:56.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:30:59.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Error during ComputeManager._cleanup_incomplete_migrations: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 7c2de86774744ebb87b992c3977da7c3
2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task
2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task
2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10821, in _cleanup_incomplete_migrations
2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task migrations = objects.MigrationList.get_by_filters(context,
2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send 2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout, 2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send 2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout, 2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait 2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout) 2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get 2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout( 2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 7c2de86774744ebb87b992c3977da7c3 2025-10-06 16:30:59.421 2 ERROR oslo_service.periodic_task 2025-10-06 16:30:59.423 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:31:01.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:31:04.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:31:04.212 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec 2025-10-06 16:31:06.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:31:09.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:31:11.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:31:16.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:31:16.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:31:16.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:31:16.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:31:16.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:31:16.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:31:21.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 
2025-10-06 16:31:21.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:31:21.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:31:21.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:31:21.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:31:24.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:31:26.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:31:29.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:31:31.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:31:34.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:31:36.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:31:39.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:31:41.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:31:44.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:31:46.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:31:49.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:31:51.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:31:56.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:31:56.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:31:56.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:31:56.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:31:56.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:31:56.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:31:59.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Error during ComputeManager._cleanup_expired_console_auth_tokens: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 0526a2ae41444055b599ecef9a77c1da
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10917, in _cleanup_expired_console_auth_tokens
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task objects.ConsoleAuthToken.clean_expired_console_auths(context)
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 0526a2ae41444055b599ecef9a77c1da
2025-10-06 16:31:59.426 2 ERROR oslo_service.periodic_task
2025-10-06 16:32:01.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:32:04.216 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec
2025-10-06 16:32:04.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:32:06.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:32:09.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:32:12.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:32:17.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:32:17.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:32:17.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:32:17.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:32:17.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:32:17.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:32:22.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:32:22.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:32:22.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:32:22.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:32:22.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:32:22.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:32:27.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:32:27.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:32:27.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:32:27.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:32:27.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:32:27.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:32:32.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:32:32.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:32:32.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5037 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:32:32.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:32:32.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:32:32.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:32:34.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:32:37.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:32:39.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:32:42.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:32:44.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:32:47.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:32:49.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:32:52.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:32:54.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:32:57.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:32:59.431 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:32:59.431 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:32:59.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:33:02.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:33:04.221 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec
2025-10-06 16:33:04.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:33:07.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:33:09.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:33:12.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:33:17.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:33:19.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:33:22.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:33:24.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:33:27.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:33:32.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:33:37.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:33:37.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:33:37.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:33:37.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:33:37.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:33:37.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:33:39.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:33:42.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:33:44.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:33:47.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:33:49.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:33:52.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:33:54.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:33:57.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Error during ComputeManager._sync_scheduler_instance_info: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 6832c0d4220b4b6cb4065748bf916d5b
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2137, in _sync_scheduler_instance_info
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task instances = objects.InstanceList.get_by_host(context, self.host,
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 6832c0d4220b4b6cb4065748bf916d5b
2025-10-06 16:33:59.435 2 ERROR oslo_service.periodic_task
2025-10-06 16:33:59.438 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:33:59.438 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9514
2025-10-06 16:33:59.438 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9518
2025-10-06 16:33:59.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:34:02.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:34:04.224 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec
2025-10-06 16:34:04.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:34:07.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:34:09.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:34:12.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:34:14.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:34:17.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:34:22.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:34:24.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:34:27.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:34:32.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:34:37.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:34:42.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:34:44.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:34:47.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:34:52.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:34:52.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:34:52.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:34:52.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:34:52.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:34:52.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:34:54.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:34:57.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Error during ComputeManager._heal_instance_info_cache: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 547106edb5674ee9925292dad5c6e8bd
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout)
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task return waiter.wait()
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task return get_hub().switch()
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task return self.greenlet.switch()
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task _queue.Empty
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred:
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task Traceback (most recent call last):
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task task(self, context)
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9519, in _heal_instance_info_cache
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task db_instances = objects.InstanceList.get_by_host(
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions(
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions',
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg,
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message,
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout,
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout,
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout)
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout(
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 547106edb5674ee9925292dad5c6e8bd
2025-10-06 16:34:59.442 2 ERROR oslo_service.periodic_task
2025-10-06 16:34:59.444 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:34:59.444 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:34:59.445 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:34:59.445 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
2025-10-06 16:34:59.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:35:02.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:35:04.227 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec
2025-10-06 16:35:07.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:35:07.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:35:07.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:35:07.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:35:07.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
/usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:35:07.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:35:09.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:35:12.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:35:17.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:35:17.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:35:17.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5034 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:35:17.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:35:17.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:35:17.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:35:19.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:35:22.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:35:27.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:35:27.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:35:27.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5029 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:35:27.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:35:27.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:35:27.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:35:32.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:35:32.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:35:32.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5026 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:35:32.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:35:32.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:35:32.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:35:37.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:35:37.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:35:37.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:35:37.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:35:37.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:35:37.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:35:42.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:35:42.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:35:42.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:35:42.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:35:42.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:35:42.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:35:44.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:35:47.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:35:52.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:35:52.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:35:52.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:35:52.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:35:52.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:35:52.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:35:54.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 
2025-10-06 16:35:57.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Error during ComputeManager._instance_usage_audit: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID a59cb856c1ad4925af66b5ce9ba5e58d 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task return self.greenlet.switch() 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task _queue.Empty 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred: 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task task(self, context) 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9768, in _instance_usage_audit 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task if objects.TaskLog.get(context, 'instance_usage_audit', begin, end, 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/objects/base.py", line 355, in wrapper 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task return fn.__get__(None, obj)(*args, **kwargs) 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions( 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions', 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg, 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message, 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout, 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout, 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout) 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout( 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID a59cb856c1ad4925af66b5ce9ba5e58d 2025-10-06 16:35:59.448 2 ERROR oslo_service.periodic_task 2025-10-06 16:35:59.450 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:35:59.451 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:35:59.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:36:02.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:36:04.231 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec 2025-10-06 16:36:04.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:36:08.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:36:10.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:36:13.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:36:18.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:36:18.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:36:18.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 
16:36:18.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:36:18.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:36:18.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:36:23.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:36:23.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:36:23.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:36:23.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:36:23.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:36:23.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:36:28.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:36:28.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:36:28.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:36:28.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:36:28.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:36:28.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:36:33.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:36:33.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:36:33.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:36:33.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:36:33.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:36:33.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:36:35.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:36:38.290 2 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:36:43.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:36:43.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:36:43.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:36:43.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:36:43.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:36:43.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:36:48.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:36:48.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:36:48.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:36:48.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:36:48.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:36:48.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:36:53.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:36:53.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:36:53.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:36:53.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:36:53.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:36:53.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:36:58.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:36:58.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:36:58.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5036 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:36:58.423 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:36:58.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:36:58.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Error during ComputeManager._sync_power_states: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 6efbafd66add408baa4ed6fafd9a5077 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task return self.greenlet.switch() 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task _queue.Empty 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred: 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task task(self, context) 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 9878, in _sync_power_states 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task db_instances = objects.InstanceList.get_by_host(context, self.host, 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions( 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions', 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg, 2025-10-06 16:36:59.454 2 ERROR 
oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message, 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout, 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout, 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout) 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout( 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 6efbafd66add408baa4ed6fafd9a5077 2025-10-06 16:36:59.454 2 ERROR oslo_service.periodic_task 2025-10-06 16:36:59.455 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:36:59.455 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10131 2025-10-06 16:36:59.456 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:37:00.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:37:03.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:37:04.235 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec 2025-10-06 16:37:05.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:37:08.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:37:10.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:37:13.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:37:18.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:37:20.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:37:23.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:37:25.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:37:28.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:37:30.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:37:33.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:37:35.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:37:38.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:37:40.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:37:43.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:37:45.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:37:48.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:37:53.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:37:53.816 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:37:53.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:37:53.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:37:53.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:37:53.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:37:55.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:37:58.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Error during ComputeManager.update_available_resource: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID b5144c2fada149ceb4e7f11b929a1400 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task return self.greenlet.switch() 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task _queue.Empty 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred: 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task task(self, context) 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10225, in update_available_resource 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task compute_nodes_in_db = self._get_compute_nodes_in_db(context, 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10272, in _get_compute_nodes_in_db 2025-10-06 16:37:59.460 2 ERROR 
oslo_service.periodic_task return objects.ComputeNodeList.get_all_by_host(context, self.host, 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task result = cls.indirection_api.object_class_action_versions( 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions', 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg, 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message, 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout, 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout, 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout) 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout( 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID b5144c2fada149ceb4e7f11b929a1400 2025-10-06 16:37:59.460 2 ERROR oslo_service.periodic_task 2025-10-06 16:37:59.462 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:38:00.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:38:03.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:38:04.238 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec 2025-10-06 16:38:05.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:38:08.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:38:10.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:38:13.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:38:14.780 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 104] Connection reset by peer. Trying again in 1 seconds.: ConnectionResetError: [Errno 104] Connection reset by peer 2025-10-06 16:38:14.782 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 104] Connection reset by peer. Trying again in 1 seconds.: ConnectionResetError: [Errno 104] Connection reset by peer 2025-10-06 16:38:15.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:38:15.796 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 2 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:38:15.823 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 2 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:38:15.844 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 104] Connection reset by peer 2025-10-06 16:38:15.860 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 1.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:38:16.873 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 3.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:38:17.820 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 4 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:38:17.835 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 4 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:38:19.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:38:19.895 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 5.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:38:20.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:38:21.840 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 6 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:38:21.855 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 6 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:38:22.487 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 104] Connection reset by peer 2025-10-06 16:38:23.404 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 104] Connection reset by peer 2025-10-06 16:38:23.418 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 1.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:38:24.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:38:24.434 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 3.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:38:24.909 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 7.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:38:25.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:38:27.452 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 5.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:38:27.860 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 8 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:38:27.874 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 8 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:38:29.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:38:30.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:38:31.931 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 9.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:38:32.466 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 7.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:38:34.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:38:35.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:38:35.874 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 10 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:38:35.886 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 10 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:38:39.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:38:39.488 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 9.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:38:39.524 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:38:40.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:38:40.953 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 11.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:38:44.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:38:45.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:38:45.898 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 12 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:38:45.913 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 12 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:38:48.507 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 11.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:38:49.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:38:50.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:38:51.974 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 13.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:38:54.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:38:55.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:38:56.561 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:38:57.926 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 14 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:38:57.947 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 14 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:38:59.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Error during ComputeManager._cleanup_running_deleted_instances: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID c87166832c75425b8e92acef780cb8c8 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 433, in get 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task return self._queues[msg_id].get(block=True, timeout=timeout) 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 322, in get 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task return waiter.wait() 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/queue.py", line 141, in wait 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task return get_hub().switch() 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task return self.greenlet.switch() 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task _queue.Empty 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task During handling of the above exception, another exception occurred: 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task Traceback (most recent call last): 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py", line 216, in run_periodic_tasks 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task task(self, context) 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10323, in _cleanup_running_deleted_instances 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task instances = self._running_deleted_instances(context) 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10378, in _running_deleted_instances 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task instances = self._get_instances_on_driver(context, filters) 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 656, in _get_instances_on_driver 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task local_instances = objects.InstanceList.get_by_filters( 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task 
result = cls.indirection_api.object_class_action_versions( 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task return cctxt.call(context, 'object_class_action_versions', 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 175, in call 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task self.transport._send(self.target, msg_ctxt, msg, 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task return self._driver.send(target, ctxt, message, 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in send 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task return self._send(target, ctxt, message, wait_for_reply, timeout, 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 670, in _send 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task result = self._waiter.wait(msg_id, timeout, 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 559, in wait 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task message = self.waiters.get(msg_id, timeout=timeout) 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 435, in get 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task raise oslo_messaging.MessagingTimeout( 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID c87166832c75425b8e92acef780cb8c8 2025-10-06 16:38:59.467 2 ERROR oslo_service.periodic_task 2025-10-06 16:38:59.469 2 DEBUG oslo_service.periodic_task [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 2025-10-06 16:38:59.469 2 DEBUG nova.compute.manager [req-95caff67-bb31-4d5e-a82e-ca2f2bbc3192 - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10780 2025-10-06 16:38:59.536 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 13.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:39:00.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:39:04.242 2 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 50.00 sec 2025-10-06 16:39:04.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:39:04.991 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 15.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:39:05.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on 
fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:39:09.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:39:11.956 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 16 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:39:11.974 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 16 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:39:12.563 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 15.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:39:13.597 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:39:14.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:39:14.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:39:14.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:39:14.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:39:14.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:39:14.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:39:15.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:39:19.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:39:20.015 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 17.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:39:24.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:39:24.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:39:24.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:39:24.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:39:24.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:39:24.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:39:25.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:39:27.598 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 17.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:39:27.985 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 18 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:39:28.000 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 18 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:39:29.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:39:30.646 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:39:34.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:39:34.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:39:34.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:39:34.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:39:34.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:39:34.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:39:35.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:39:37.043 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 19.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:39:39.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:39:40.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:39:44.631 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 19.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:39:44.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:39:45.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:39:46.014 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 
is unreachable: [Errno 111] ECONNREFUSED. Trying again in 20 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:39:46.029 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 20 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:39:47.683 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:39:49.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:39:50.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:39:54.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:39:55.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:39:56.072 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 21.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:39:59.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:40:00.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:40:03.661 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 21.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:40:04.730 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:40:04.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:40:05.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:40:06.049 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 22 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:40:06.065 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 22 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:40:09.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:40:14.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:40:14.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:40:14.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5028 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:40:14.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:40:14.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:40:14.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:40:15.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:40:17.105 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 23.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:40:20.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:40:20.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:40:21.771 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:40:24.700 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 23.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:40:25.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:40:25.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:40:28.086 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 24 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:40:28.099 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 24 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:40:30.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:40:35.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:40:35.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:40:35.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:40:35.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:40:35.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:40:35.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:40:35.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:40:38.811 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:40:40.132 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 25.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:40:40.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:40:45.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:40:45.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:40:45.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:40:45.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:40:45.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:40:45.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:40:45.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:40:47.737 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 25.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:40:50.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:40:52.117 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 26 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:40:52.126 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 26 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:40:55.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:40:55.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:40:55.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:40:55.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:40:55.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:40:55.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:40:55.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:40:55.857 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:41:00.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:41:00.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:41:05.164 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 27.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:41:05.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:41:05.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:41:10.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:41:12.783 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 27.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:41:12.888 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:41:15.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:41:15.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:41:15.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5044 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:41:15.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:41:15.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:41:15.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:41:15.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:41:18.155 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 28 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:41:18.169 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 28 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:41:20.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:41:25.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:41:25.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:41:29.919 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:41:30.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:41:30.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:41:32.197 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 29.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:41:35.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:41:35.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:41:39.832 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 29.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:41:40.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:41:45.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:41:45.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:41:46.201 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 30 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:41:46.218 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 30 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:41:46.966 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:41:50.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:41:55.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:41:55.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:41:55.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:41:55.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:41:55.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:41:55.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:41:55.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:42:00.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:42:00.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:42:01.240 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:42:04.017 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:42:05.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:42:05.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:42:08.880 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:42:10.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:42:15.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:42:15.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:42:16.246 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on 
standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:42:16.261 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:42:20.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:42:21.054 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:42:25.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:42:25.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:42:30.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:42:32.282 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:42:35.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:42:35.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:42:38.087 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:42:39.933 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:42:40.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:42:45.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:42:45.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:42:48.288 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:42:48.298 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:42:50.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:42:55.123 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:42:55.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:42:55.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:43:00.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:43:00.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:43:03.330 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:43:05.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:43:05.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:43:10.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:43:10.992 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:43:12.165 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:43:15.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:43:15.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:43:20.363 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:43:20.364 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:43:20.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:43:25.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:43:25.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:43:29.192 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:43:30.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:43:34.391 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:43:35.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:43:35.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:43:40.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:43:40.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:43:42.055 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:43:45.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:43:45.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:43:46.228 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:43:50.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:43:52.426 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:43:52.428 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: . Trying again in 1 seconds.: amqp.exceptions.RecoverableConnectionError: 2025-10-06 16:43:53.445 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 2 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:43:55.457 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 4 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:43:55.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:43:55.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:43:55.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5006 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:43:55.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:43:55.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:43:55.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:43:55.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:43:59.479 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 6 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:44:00.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:44:00.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:44:03.267 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:44:05.467 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:44:05.498 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 8 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:44:05.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:44:05.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:44:10.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:44:13.105 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:44:13.517 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 10 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:44:15.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:44:15.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:44:15.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5044 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:44:15.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:44:15.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:44:15.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:44:20.311 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:44:20.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:44:23.540 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 12 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:44:24.462 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:44:25.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:44:25.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:44:25.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5046 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:44:25.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:44:25.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:44:25.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:44:30.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:44:30.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:44:30.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:44:30.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:44:30.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:44:30.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:44:35.562 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 14 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:44:35.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:44:36.501 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:44:37.357 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:44:40.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:44:40.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:44:40.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:44:40.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:44:40.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:44:40.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:44:44.158 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:44:45.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:44:49.585 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 16 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:44:50.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:44:50.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:44:50.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:44:50.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:44:50.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:44:50.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:44:54.398 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:44:55.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:44:55.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:44:56.503 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:45:00.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:45:00.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:45:00.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:45:00.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:45:00.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:45:00.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:45:05.611 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 18 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:45:05.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:45:05.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:45:05.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:45:05.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:45:05.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:45:05.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:45:07.537 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:45:10.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:45:10.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:45:10.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5034 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:45:10.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:45:10.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:45:10.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:45:11.447 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:45:15.199 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:45:15.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:45:15.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:45:20.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:45:20.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:45:20.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:45:20.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:45:21.033 2 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:45:21.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:45:23.638 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 20 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:45:25.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:45:26.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:45:28.487 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:45:28.573 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:45:31.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:45:31.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:45:31.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5036 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:45:31.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:45:31.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:45:31.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:45:35.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:45:36.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:45:38.577 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:45:41.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:45:41.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:45:41.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:45:41.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:45:41.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:45:41.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:45:43.667 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 22 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:45:45.524 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:45:45.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:45:46.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:45:46.248 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:45:50.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:45:51.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:45:55.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:45:56.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:46:00.612 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:46:00.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:46:01.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:46:02.577 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:46:05.689 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 24 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:46:05.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:46:06.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:46:09.620 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:46:11.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:46:11.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:46:11.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:46:11.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:46:11.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:46:11.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:46:15.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:46:16.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:46:17.300 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:46:19.624 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:46:21.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:46:21.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:46:21.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5044 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:46:21.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:46:21.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:46:21.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:46:25.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:46:26.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:46:29.721 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 26 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:46:31.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:46:31.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:46:31.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:46:31.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:46:31.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:46:31.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:46:32.660 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:46:35.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:46:36.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:46:36.667 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:46:40.672 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:46:41.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:46:41.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:46:41.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:46:41.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:46:41.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:46:41.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:46:45.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:46:46.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:46:48.356 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:46:51.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:46:51.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:46:51.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:46:51.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:46:51.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:46:51.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:46:53.710 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:46:55.748 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 28 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:46:55.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:46:56.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:47:00.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:47:01.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:47:04.754 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:47:05.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:47:06.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:47:10.755 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:47:11.713 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:47:11.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:47:11.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:47:11.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:47:11.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:47:11.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:47:11.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:47:15.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:47:16.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:47:19.406 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:47:21.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:47:21.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:47:21.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:47:21.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:47:21.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:47:21.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:47:23.789 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 30 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:47:25.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:47:27.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:47:27.798 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:47:32.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:47:32.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:47:32.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:47:32.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:47:32.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:47:32.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:47:35.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:47:36.803 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:47:37.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:47:42.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:47:42.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:47:42.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:47:42.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:47:42.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:47:42.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:47:42.754 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:47:44.843 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:47:45.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:47:47.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:47:50.449 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:47:52.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:47:52.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:47:52.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:47:52.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:47:52.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:47:52.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:47:53.842 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:47:55.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:47:57.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:48:00.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:48:01.883 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:48:02.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:48:05.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:48:07.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:48:08.841 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:48:12.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:48:12.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:48:12.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:48:12.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:48:12.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:48:12.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:48:13.796 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:48:15.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:48:17.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:48:18.927 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:48:21.500 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:48:22.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:48:22.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:48:22.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:48:22.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:48:22.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:48:22.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:48:25.876 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:48:25.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:48:27.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:48:32.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:48:32.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:48:32.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:48:32.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:48:32.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:48:32.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:48:35.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:48:36.049 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:48:37.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:48:40.879 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:48:42.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:48:42.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:48:42.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5052 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:48:42.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:48:42.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:48:42.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:48:44.835 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:48:45.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:48:47.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:48:52.576 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:48:52.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:48:52.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:48:52.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:48:52.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:48:52.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:48:52.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:48:53.087 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:48:55.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:48:57.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:48:57.913 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:49:00.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:49:02.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:49:05.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:49:07.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:49:10.131 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:49:12.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:49:12.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:49:12.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:49:12.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:49:12.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:49:12.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:49:12.913 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:49:15.875 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:49:15.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:49:17.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:49:22.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:49:22.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:49:22.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:49:22.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:49:22.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:49:22.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:49:23.631 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:49:25.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:49:27.174 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:49:28.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:49:29.952 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:49:33.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:49:33.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:49:33.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5043 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:49:33.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:49:33.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:49:33.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:49:36.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:49:38.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:49:41.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:49:43.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:49:44.216 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:49:44.956 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:49:46.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:49:46.914 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:49:48.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:49:53.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:49:53.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:49:53.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:49:53.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:49:53.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:49:53.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:49:54.683 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:49:56.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:49:58.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:50:01.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:50:01.261 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:50:01.988 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:50:03.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:50:06.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:50:08.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:50:11.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:50:13.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:50:16.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:50:16.985 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:50:17.950 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:50:18.296 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:50:18.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:50:23.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:50:23.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:50:23.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:50:23.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:50:23.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:50:23.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:50:25.734 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:50:26.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:50:28.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:50:33.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:50:34.022 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:50:35.335 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:50:36.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:50:38.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:50:43.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:50:43.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:50:43.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5037 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:50:43.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:50:43.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:50:43.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:50:46.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:50:48.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:50:48.990 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:50:49.019 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: . Trying again in 1 seconds.: amqp.exceptions.RecoverableConnectionError:
2025-10-06 16:50:50.034 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 2 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:50:52.050 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 4 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:50:52.375 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:50:53.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:50:53.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:50:53.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:50:53.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:50:53.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:50:53.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:50:56.068 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 6 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:50:56.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:50:56.786 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:50:58.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:51:01.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:51:02.088 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 8 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:51:03.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:51:06.059 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:51:06.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:51:08.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:51:09.415 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:51:10.107 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 10 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:51:13.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:51:13.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:51:13.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:51:13.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:51:13.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:51:13.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:51:16.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:51:18.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:51:20.032 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:51:20.130 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 12 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:51:23.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:51:23.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:51:23.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:51:23.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:51:23.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:51:23.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:51:26.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:51:26.460 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:51:27.836 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:51:28.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:51:32.158 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 14 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:51:33.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:51:33.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:51:33.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5040 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:51:33.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:51:33.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:51:33.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:51:36.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:51:38.099 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:51:38.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:51:43.501 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:51:43.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:51:43.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:51:43.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:51:43.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:51:44.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:51:44.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:51:46.179 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 16 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:51:46.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:51:49.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:51:51.071 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:51:54.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:51:54.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:51:54.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:51:54.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:51:54.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:51:54.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:51:56.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:51:58.882 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:51:59.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:52:00.541 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:52:01.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:52:02.198 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 18 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:52:04.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:52:06.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:52:09.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:52:10.130 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:52:14.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:52:14.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:52:14.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:52:14.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:52:14.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:52:14.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:52:16.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:52:17.582 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:52:19.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:52:20.226 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 20 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:52:22.110 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:52:24.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:52:24.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:52:24.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5031 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:52:24.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:52:24.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:52:24.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:52:26.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:52:29.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:52:29.937 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:52:34.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:52:34.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:52:34.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:52:34.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:52:34.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:52:34.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:52:34.624 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:52:36.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:52:39.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:52:40.257 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 22 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:52:41.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:52:42.166 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:52:44.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:52:46.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:52:49.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:52:51.672 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:52:53.152 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:52:54.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:52:54.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:52:54.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:52:54.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:52:54.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:52:54.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:52:56.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:52:59.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:53:00.987 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:53:01.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:53:02.293 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 24 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:53:04.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:53:06.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:53:08.715 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:53:09.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:53:14.205 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:53:14.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:53:14.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:53:14.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:53:14.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:53:14.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:53:14.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:53:16.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:53:19.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:53:24.188 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:53:24.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:53:24.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:53:24.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5045 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:53:24.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:53:24.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:53:24.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:53:25.761 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect:
[Errno 111] ECONNREFUSED 2025-10-06 16:53:26.326 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 26 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:53:26.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:53:29.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:53:32.035 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:53:34.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:53:34.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:53:34.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:53:34.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:53:34.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:53:34.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:53:36.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:53:39.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:53:42.808 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:53:44.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:53:44.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:53:44.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:53:44.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:53:44.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:53:44.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:53:46.241 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:53:46.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:53:50.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:53:52.359 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 28 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:53:55.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:53:55.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:53:55.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:53:55.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:53:55.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:53:55.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:53:55.227 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:53:56.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:53:59.848 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:54:00.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:54:01.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:54:03.087 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:54:05.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:54:06.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:54:10.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:54:15.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:54:15.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:54:15.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:54:15.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:54:15.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:54:15.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:54:16.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:54:16.888 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:54:18.275 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:54:20.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:54:20.395 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 30 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:54:25.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:54:25.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 16:54:25.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 16:54:25.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:54:25.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:54:25.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 16:54:26.267 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:54:26.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:54:30.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 16:54:33.933 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 16:54:34.145 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 16:54:35.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 
2025-10-06 16:54:36.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:54:40.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:54:45.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:54:45.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:54:45.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:54:45.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:54:45.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:54:45.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:54:46.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:54:50.313 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:54:50.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:54:50.428 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:54:50.979 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:54:55.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:54:56.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:54:57.315 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:55:00.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:55:01.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:55:05.198 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:55:05.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:55:06.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:55:08.024 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:55:10.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:55:15.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:55:15.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:55:15.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:55:15.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:55:15.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:55:15.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:55:16.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:55:20.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:55:22.373 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:55:22.462 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:55:25.071 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:55:25.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:55:26.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:55:28.379 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:55:30.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:55:35.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:55:36.247 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:55:36.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:55:40.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:55:41.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:55:42.108 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:55:45.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:55:46.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:55:50.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:55:54.413 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:55:54.507 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:55:55.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:55:55.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:55:55.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:55:55.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:55:55.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:55:55.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:55:56.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:55:59.132 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:55:59.417 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:56:00.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:56:02.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:56:05.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:56:07.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:56:07.294 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:56:10.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:56:12.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:56:15.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:56:16.169 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:56:17.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:56:20.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:56:25.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:56:25.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:56:25.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:56:25.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:56:25.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:56:25.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:56:26.451 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:56:26.543 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:56:27.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:56:30.453 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:56:30.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:56:33.239 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:56:35.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:56:37.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:56:38.342 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:56:41.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:56:46.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:56:46.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:56:46.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:56:46.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:56:46.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:56:46.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:56:47.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:56:50.282 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:56:51.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:56:56.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:56:56.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:56:56.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:56:56.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:56:56.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:56:56.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:56:57.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:56:58.487 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:56:58.582 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:57:01.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:57:01.490 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:57:02.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:57:06.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:57:07.326 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:57:07.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:57:09.388 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:57:11.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:57:16.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:57:16.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:57:16.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:57:16.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:57:16.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:57:16.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:57:17.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:57:21.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:57:24.372 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:57:26.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:57:26.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:57:26.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:57:26.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:57:26.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:57:26.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:57:27.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:57:30.530 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:57:30.623 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:57:31.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:57:32.530 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:57:32.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:57:36.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:57:37.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:57:40.435 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:57:41.418 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:57:41.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:57:42.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:57:46.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:57:51.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:57:51.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:57:51.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:57:51.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:57:51.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:57:51.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:57:52.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:57:56.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:57:58.464 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:58:01.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:58:01.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:58:01.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:58:01.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:58:01.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:58:01.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:58:02.598 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:58:02.667 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:58:02.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:58:03.566 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:58:06.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:58:07.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:58:11.509 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:58:11.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:58:12.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:58:15.506 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:58:16.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:58:21.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:58:21.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:58:21.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:58:21.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:58:21.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:58:21.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:58:22.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:58:26.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:58:31.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:58:31.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:58:31.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:58:31.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:58:31.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:58:31.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:58:32.547 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:58:32.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:58:34.598 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:58:34.631 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:58:34.699 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:58:36.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:58:37.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:58:41.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:58:42.586 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:58:42.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:58:47.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:58:49.601 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:58:52.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:58:52.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:58:52.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:58:52.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:58:52.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:58:52.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:58:52.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:58:57.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:58:57.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:59:02.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:59:02.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:59:05.648 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:59:06.648 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:59:06.675 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:59:06.736 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:59:07.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:59:07.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:59:12.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:59:12.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:59:13.659 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:59:17.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:59:22.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:59:22.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:59:22.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:59:22.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:59:22.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:59:22.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:59:22.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:59:23.691 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:59:27.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:59:27.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:59:32.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:59:32.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:59:36.700 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:59:37.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:59:37.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:59:38.733 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:59:38.770 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:59:40.735 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 16:59:42.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:59:42.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:59:44.716 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 16:59:47.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:59:52.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:59:52.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 16:59:52.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 16:59:52.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:59:52.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:59:52.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 16:59:52.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:59:57.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 16:59:57.780 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 17:00:02.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:00:02.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:00:02.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:00:02.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:00:02.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:00:02.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:00:02.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:00:07.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:00:07.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:00:07.762 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:00:10.787 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:00:10.814 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:00:12.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:00:12.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:00:14.823 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 17:00:15.774 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:00:17.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:00:17.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:00:17.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5022 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:00:17.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:00:17.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:00:17.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:00:22.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:00:22.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:00:22.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:00:22.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:00:22.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:00:22.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:00:27.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:00:27.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:00:27.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:00:27.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:00:27.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:00:27.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:00:31.862 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 17:00:32.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:00:32.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:00:37.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:00:37.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:00:38.825 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:00:42.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:00:42.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:00:42.829 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:00:42.868 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED.
Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:00:46.833 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:00:47.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:00:48.910 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 17:00:52.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:00:52.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:00:57.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:01:02.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:01:02.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:01:05.954 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 17:01:07.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:01:07.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:01:09.871 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:01:12.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:01:12.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:01:14.877 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:01:14.899 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:01:17.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:01:17.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:01:17.881 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:01:22.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:01:22.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:01:22.990 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 17:01:27.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:01:27.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:01:32.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:01:32.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:01:37.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:01:37.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:01:40.026 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 17:01:40.917 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:01:42.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:01:42.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:01:46.930 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:01:46.944 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:01:47.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:01:47.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:01:48.928 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:01:52.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:01:52.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:01:57.066 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 17:01:57.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:01:57.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:01:57.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:01:57.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:01:57.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:01:57.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:02:02.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:02:02.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:02:02.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:02:02.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:02:02.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:02:02.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:02:07.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:02:07.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:02:07.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5029 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:02:07.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE 
_transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:02:07.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:02:07.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:02:11.970 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:02:12.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:02:12.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:02:12.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5023 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:02:12.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:02:12.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:02:12.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:02:14.108 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 17:02:17.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:02:17.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:02:17.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:02:17.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:02:18.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:02:18.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:02:19.000 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:02:19.001 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:02:19.982 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:02:23.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:02:23.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:02:28.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:02:28.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:02:28.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:02:28.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:02:28.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:02:28.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:02:31.153 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 17:02:33.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:02:38.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:02:38.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:02:38.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:02:38.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:02:38.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:02:38.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:02:43.023 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:02:43.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:02:48.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:02:48.197 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 17:02:51.031 2 ERROR oslo.messaging._drivers.impl_rabbit [-] 
Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:02:51.059 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:02:51.060 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:02:53.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:02:58.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:02:58.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:02:58.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:02:58.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:02:58.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:02:58.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:03:03.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:03:03.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:03:05.234 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 17:03:08.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:03:08.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:03:08.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:03:08.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:03:08.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:03:08.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:03:13.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:03:13.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 
17:03:13.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:03:13.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:03:13.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:03:13.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:03:14.072 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:03:18.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:03:18.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:03:18.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:03:18.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:03:18.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:03:18.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:03:22.084 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:03:22.274 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 17:03:23.097 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:03:23.098 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:03:23.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:03:23.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:03:23.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5030 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:03:23.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:03:23.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:03:23.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:03:28.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:03:28.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:03:28.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:03:28.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:03:28.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:03:28.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:03:33.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:03:33.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:03:33.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:03:33.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:03:33.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:03:33.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:03:38.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:03:38.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:03:38.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:03:38.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE 
_transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:03:38.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:03:38.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:03:39.319 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 17:03:43.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:03:43.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:03:43.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:03:43.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:03:43.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:03:43.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:03:45.125 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:03:48.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:03:53.138 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:03:53.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:03:53.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:03:53.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:03:53.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:03:53.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:03:53.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:03:55.152 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:03:55.153 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:03:56.362 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 17:03:58.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:03:58.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:03:58.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:03:58.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:03:58.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:03:58.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:04:03.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:04:03.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:04:08.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:04:08.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:04:08.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:04:08.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:04:08.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:04:08.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:04:13.393 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 17:04:13.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:04:13.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:04:13.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5040 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:04:13.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:04:13.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:04:13.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:04:16.187 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:04:18.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:04:23.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:04:23.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:04:23.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:04:23.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:04:23.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:04:23.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:04:24.191 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:04:27.203 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:04:27.203 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:04:28.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:04:28.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:04:28.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:04:28.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:04:28.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:04:28.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:04:30.422 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 17:04:33.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:04:33.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:04:38.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:04:38.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:04:38.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5027 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:04:38.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:04:38.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:04:38.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:04:43.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:04:43.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:04:43.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:04:43.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:04:43.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:04:43.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:04:47.241 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] 
ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:04:47.456 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 17:04:48.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:04:48.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:04:48.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:04:48.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:04:48.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:04:48.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:04:53.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:04:53.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:04:53.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:04:53.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:04:53.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:04:53.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:04:55.249 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:04:58.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:04:58.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:04:58.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:04:58.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:04:58.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:04:58.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:04:59.266 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] 
ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:04:59.267 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:05:03.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:05:04.498 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 17:05:08.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:05:08.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:05:08.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:05:08.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:05:08.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:05:08.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:05:13.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:05:13.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:05:13.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:05:13.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:05:13.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:05:13.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:05:18.301 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:05:18.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:05:18.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:05:18.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:05:18.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:05:18.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: 
entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:05:18.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:05:21.539 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 17:05:23.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:05:23.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:05:23.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:05:23.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:05:23.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:05:23.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:05:26.311 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:05:28.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:05:28.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:05:28.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:05:28.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:05:28.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:05:28.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:05:31.327 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:05:31.328 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. 
Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:05:33.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:05:33.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:05:38.576 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED 2025-10-06 17:05:39.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:05:39.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:05:39.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5043 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:05:39.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:05:39.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:05:39.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:05:44.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:05:44.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:05:44.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:05:44.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:05:44.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:05:44.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 2025-10-06 17:05:49.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 2025-10-06 17:05:49.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:05:49.365 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED 2025-10-06 17:05:54.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:05:54.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 2025-10-06 17:05:54.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 2025-10-06 17:05:54.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE 
2025-10-06 17:05:54.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:05:54.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:05:55.613 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 17:05:57.375 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:05:59.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:05:59.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:05:59.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:05:59.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:05:59.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:05:59.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:06:03.394 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:06:03.395 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:06:04.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:06:04.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:06:09.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:06:09.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:06:09.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:06:09.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:06:09.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:06:09.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:06:12.651 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 17:06:14.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:06:14.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:06:14.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:06:14.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:06:14.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:06:14.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:06:19.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:06:20.429 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:06:24.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:06:24.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:06:24.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:06:24.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:06:24.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:06:24.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:06:28.435 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:06:29.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:06:29.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:06:29.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:06:29.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:06:29.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:06:29.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:06:29.692 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 17:06:34.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:06:34.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:06:34.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:06:34.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:06:34.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:06:34.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:06:35.463 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:06:35.463 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:06:39.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:06:39.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:06:39.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:06:39.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:06:39.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:06:39.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:06:44.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:06:44.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:06:44.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:06:44.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:06:44.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:06:44.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:06:46.738 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 17:06:49.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:06:49.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:06:49.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:06:49.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:06:49.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:06:49.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:06:51.495 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:06:54.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:06:54.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:06:54.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:06:54.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:06:54.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:06:54.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:06:59.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:06:59.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:06:59.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:06:59.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:06:59.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:06:59.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:06:59.506 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:07:03.775 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 17:07:04.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:07:04.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:07:04.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:07:04.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:07:04.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:07:04.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:07:07.528 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:07:07.529 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:07:09.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:07:14.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:07:14.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:07:14.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:07:14.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:07:14.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:07:14.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:07:19.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:07:19.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:07:19.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:07:19.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:07:19.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:07:19.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:07:20.820 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 17:07:22.563 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:07:24.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:07:24.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:07:24.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:07:24.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:07:24.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:07:24.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:07:29.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
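Interleaved with the OVSDB keepalive, oslo.messaging keeps retrying the RabbitMQ broker and logs two fixed cadences: a generic "Connection failed ... (retrying in 31.0 seconds)" and, per connection id, "Trying again in 32 seconds". A minimal sketch of such a fixed-delay reconnect loop, assuming a plain TCP connect (connect_once and RETRY_DELAY are hypothetical names; host, port, and delay are taken from the log; this is not oslo.messaging's actual code):

    import socket
    import time

    HOST, PORT = "standalone.internalapi.localdomain", 5672  # from the log
    RETRY_DELAY = 31.0  # seconds, matching "retrying in 31.0 seconds"

    def connect_once(timeout=5.0):
        # Raises ConnectionRefusedError ([Errno 111]) while the broker is down.
        return socket.create_connection((HOST, PORT), timeout=timeout)

    def connect_with_retry():
        while True:
            try:
                return connect_once()
            except ConnectionRefusedError as exc:
                print(f"Connection failed: {exc} (retrying in {RETRY_DELAY} seconds)")
                time.sleep(RETRY_DELAY)

A fixed delay like this never backs off further, which is consistent with the steady ~31-32 second error rhythm that continues below until the broker becomes reachable again.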
2025-10-06 17:07:29.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:07:29.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5040 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:07:29.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:07:29.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:07:29.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:07:30.570 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:07:34.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:07:34.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:07:34.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:07:34.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:07:34.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:07:34.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:07:37.861 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 17:07:39.584 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:07:39.585 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:07:39.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:07:39.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:07:39.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5027 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:07:39.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:07:39.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:07:39.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:07:44.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:07:44.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:07:44.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:07:44.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:07:44.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:07:44.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:07:49.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:07:49.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:07:49.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:07:49.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:07:49.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:07:49.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:07:53.616 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:07:54.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:07:54.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:07:54.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:07:54.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:07:54.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:07:54.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:07:54.905 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 17:07:59.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:07:59.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:07:59.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:07:59.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:07:59.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:07:59.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:08:01.627 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:08:04.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:08:04.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:08:04.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:08:04.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:08:04.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:08:04.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:08:09.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:08:11.656 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:08:11.657 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:08:11.948 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 17:08:14.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:08:14.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:08:14.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:08:14.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:08:14.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:08:14.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:08:19.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:08:19.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:08:19.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:08:19.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:08:19.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:08:19.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:08:24.689 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:08:24.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:08:24.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:08:24.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:08:24.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:08:24.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:08:24.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:08:28.992 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 17:08:29.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:08:29.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:08:29.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:08:29.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:08:30.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:08:30.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:08:32.693 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:08:35.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:08:35.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:08:35.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:08:35.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:08:35.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:08:35.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:08:40.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:08:43.724 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:08:43.724 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:08:45.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:08:45.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:08:45.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5028 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:08:45.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:08:45.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:08:45.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:08:46.037 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 17:08:50.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:08:50.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:08:50.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:08:50.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:08:50.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:08:50.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:08:55.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:08:55.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:08:55.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:08:55.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:08:55.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:08:55.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:08:55.754 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:09:00.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:09:00.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:09:00.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5011 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:09:00.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:09:00.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:09:00.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:09:03.081 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 17:09:03.765 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:09:05.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:09:05.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:09:05.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:09:05.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:09:05.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:09:05.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:09:10.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:09:10.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:09:10.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:09:10.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:09:10.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:09:10.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:09:15.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:09:15.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:09:15.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:09:15.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:09:15.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:09:15.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:09:15.794 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:09:15.795 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:09:20.129 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 17:09:20.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:09:20.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:09:20.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:09:20.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:09:20.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:09:20.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:09:25.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:09:26.822 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:09:30.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:09:30.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:09:34.831 2 ERROR oslo.messaging._drivers.impl_rabbit [-] Connection failed: [Errno 111] ECONNREFUSED (retrying in 31.0 seconds): ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:09:35.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:09:37.167 2 INFO oslo.messaging._drivers.impl_rabbit [-] A recoverable connection/channel error occurred, trying to reconnect: [Errno 111] ECONNREFUSED
2025-10-06 17:09:40.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:09:45.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
2025-10-06 17:09:45.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:09:45.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
2025-10-06 17:09:45.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:09:45.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
2025-10-06 17:09:45.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
2025-10-06 17:09:47.861 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [6122e38c-4b4c-450f-98ad-ad8cd12ea037] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:09:47.862 2 ERROR oslo.messaging._drivers.impl_rabbit [-] [9f6b96de-17fe-48f4-88e7-9af8a6ce2856] AMQP server on standalone.internalapi.localdomain:5672 is unreachable: [Errno 111] ECONNREFUSED. Trying again in 32 seconds.: ConnectionRefusedError: [Errno 111] ECONNREFUSED
2025-10-06 17:09:50.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263