Waiting for 60... 60 is ready!
Waiting for 30... 30 is ready!
Testing libvirt connection (session mode)...
Debug info:
  Current user: root (UID: 0)
  Groups: root kvm
  XDG_RUNTIME_DIR: /run/user/989
Extracting connection_uri from /etc/nova/nova.conf...
Raw line from config: connection_uri = qemu+unix:///session?socket=/run/user/989/libvirt/libvirt-sock
Extracted URI: qemu+unix:///session?socket=/run/user/989/libvirt/libvirt-sock
Session socket path: /run/user/989/libvirt/libvirt-sock
[OK] Session libvirt directory exists
 25  0 drwx------ 10 989 987 260 Mar  7 20:57 /run/user/989/libvirt/
 56  0 drwx------  2 989 987  40 Mar  7 20:50 /run/user/989/libvirt/hostdevmgr
 48  0 drwxr-xr-x  3 989 987  60 Mar  7 20:50 /run/user/989/libvirt/qemu
 45  0 drwx------  3 989 987  60 Mar  7 20:50 /run/user/989/libvirt/nodedev
 41  0 drwxr-xr-x  3 989 987  60 Mar  7 20:50 /run/user/989/libvirt/storage
 38  0 drwx------  3 989 987  60 Mar  7 20:50 /run/user/989/libvirt/secrets
 35  0 drwx------  3 989 987  60 Mar  7 20:50 /run/user/989/libvirt/interface
 31  0 drwxr-xr-x  3 989 987  60 Mar  7 20:50 /run/user/989/libvirt/network
 29  0 drwx------  2 989 987  60 Mar  7 20:50 /run/user/989/libvirt/common
 28  0 srwxrwx---  1 989 kvm   0 Mar  7 20:50 /run/user/989/libvirt/libvirt-admin-sock
 27  0 srwxrwx---  1 989 kvm   0 Mar  7 20:50 /run/user/989/libvirt/libvirt-sock
 26  4 -rw-r--r--  1 989 987   5 Mar  7 20:50 /run/user/989/libvirt/libvirtd.pid
[OK] Libvirt session connection OK!
Starting Nova Compute service...
Starting compute node discovery in background...
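The "Extracting connection_uri ... [OK] Session libvirt connection OK!" lines above come from a startup script that is not shown here. A minimal sketch of a check like it, assuming the `connection_uri = qemu+unix:///session?socket=...` format seen in the log (the helper name `check_session_libvirt` is hypothetical):

```shell
#!/bin/sh
# Hypothetical sketch of the session-libvirt readiness check logged above.
# Assumes nova.conf contains a "connection_uri = <uri>" line whose session
# URI embeds the socket path after "?socket=", as in the log output.
check_session_libvirt() {
    conf="$1"
    # Pull the first connection_uri line and strip the "key =" prefix.
    raw=$(grep -E '^[[:space:]]*connection_uri[[:space:]]*=' "$conf" | head -n1)
    uri=${raw#*=}
    uri=$(echo "$uri" | sed 's/^[[:space:]]*//')
    echo "Extracted URI: $uri"
    # The socket path is everything after "socket=" in the session URI.
    sock=${uri#*socket=}
    if [ -S "$sock" ]; then
        echo "[OK] libvirt session socket exists: $sock"
    else
        echo "[FAIL] no socket at $sock" >&2
        return 1
    fi
}

# Usage (assumption: run inside the nova-compute container):
# check_session_libvirt /etc/nova/nova.conf
```

A fuller check would follow up with `virsh -c "$uri" version` to confirm the daemon answers, not just that the socket file exists.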
2026-03-07 20:58:41.114 1 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
2026-03-07 20:58:41.429 1 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
2026-03-07 20:58:42.307 1 INFO nova.virt.driver [None req-36e7b430-7d26-4cd4-bf01-f75ffbbaf39c - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
2026-03-07 20:58:42.431 1 INFO nova.compute.provider_config [None req-36e7b430-7d26-4cd4-bf01-f75ffbbaf39c - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
2026-03-07 20:58:42.972 1 ERROR nova.db.main.api [None req-132cd483-c52d-47df-a064-e6412313864b - - - - - -] No DB access allowed in nova-compute:
  File "/usr/local/lib/python3.11/dist-packages/eventlet/greenthread.py", line 272, in main
    result = function(*args, **kwargs)
  File "/usr/local/lib/python3.11/dist-packages/nova/utils.py", line 663, in context_wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.11/dist-packages/nova/context.py", line 422, in gather_result
    result = fn(*args, **kwargs)
  File "/usr/local/lib/python3.11/dist-packages/nova/db/main/api.py", line 179, in wrapper
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.11/dist-packages/nova/objects/service.py", line 557, in _db_service_get_minimum_version
    return db.service_get_minimum_version(context, binaries)
  File "/usr/local/lib/python3.11/dist-packages/nova/db/main/api.py", line 238, in wrapper
    _check_db_access()
  File "/usr/local/lib/python3.11/dist-packages/nova/db/main/api.py", line 188, in _check_db_access
    stacktrace = ''.join(traceback.format_stack())
2026-03-07 20:58:42.975 1 ERROR nova.db.main.api [None req-132cd483-c52d-47df-a064-e6412313864b - - - - - -] No DB access allowed in nova-compute:
  File "/usr/local/lib/python3.11/dist-packages/eventlet/greenthread.py", line 272, in main
    result = function(*args, **kwargs)
  File "/usr/local/lib/python3.11/dist-packages/nova/utils.py", line 663, in context_wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.11/dist-packages/nova/context.py", line 422, in gather_result
    result = fn(*args, **kwargs)
  File "/usr/local/lib/python3.11/dist-packages/nova/db/main/api.py", line 179, in wrapper
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.11/dist-packages/nova/objects/service.py", line 557, in _db_service_get_minimum_version
    return db.service_get_minimum_version(context, binaries)
  File "/usr/local/lib/python3.11/dist-packages/nova/db/main/api.py", line 238, in wrapper
    _check_db_access()
  File "/usr/local/lib/python3.11/dist-packages/nova/db/main/api.py", line 188, in _check_db_access
    stacktrace = ''.join(traceback.format_stack())
2026-03-07 20:58:42.975 1 WARNING nova.objects.service [None req-132cd483-c52d-47df-a064-e6412313864b - - - - - -] Failed to get minimum service version for cell 7f90e64d-3b32-46fc-a8eb-3ec7679417e5
2026-03-07 20:58:42.976 1 WARNING nova.objects.service [None req-132cd483-c52d-47df-a064-e6412313864b - - - - - -] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
2026-03-07 20:58:43.011 1 INFO nova.service [-] Starting compute node (version 0.0.0)
2026-03-07 20:58:43.539 1 INFO nova.virt.libvirt.driver [None req-823894bd-dbae-4bac-ab7f-1556925be2fa - - - - - -] Connection event '1' reason 'None'
2026-03-07 20:58:43.548 1 INFO nova.virt.libvirt.host [None req-823894bd-dbae-4bac-ab7f-1556925be2fa - - - - - -] Libvirt host capabilities [capabilities XML flattened in this capture; recoverable details: host UUID d3c3aefc-f2f3-4c24-abfd-e12f267e22b4, arch x86_64, CPU model EPYC-Rome-v4 (AMD), migration transports tcp and rdma, hvm guest support for 32- and 64-bit via /usr/libexec/qemu-kvm with machine types pc-i440fx-rhel7.6.0 (pc) and pc-q35-rhel7.6.0 through pc-q35-rhel9.8.0 (q35)]
2026-03-07 20:58:43.597 1 INFO nova.virt.libvirt.host [None req-823894bd-dbae-4bac-ab7f-1556925be2fa - - - - - -] Secure Boot support detected
2026-03-07 20:58:44.125 1 INFO nova.virt.node [None req-823894bd-dbae-4bac-ab7f-1556925be2fa - - - - - -] Determined node identity 464b5855-0c9f-4dc7-bf0e-325b23f143ef from /var/lib/nova/compute_id
2026-03-07 20:58:44.126 1 INFO nova.virt.node [None req-823894bd-dbae-4bac-ab7f-1556925be2fa - - - - - -] Determined node identity 464b5855-0c9f-4dc7-bf0e-325b23f143ef from /var/lib/nova/compute_id
2026-03-07 20:58:45.648 1 INFO nova.compute.manager [None req-823894bd-dbae-4bac-ab7f-1556925be2fa - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
2026-03-07 20:58:46.744 1 WARNING nova.virt.libvirt.driver [None req-823894bd-dbae-4bac-ab7f-1556925be2fa - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
2026-03-07 20:58:48.945 1 INFO nova.virt.libvirt.host [None req-823894bd-dbae-4bac-ab7f-1556925be2fa - - - - - -] kernel doesn't support AMD SEV
Discovery attempt 1/10...
Found 2 cell mappings.
Skipping cell0 since it does not contain hosts.
Getting computes from cell 'cell1': 7f90e64d-3b32-46fc-a8eb-3ec7679417e5
Found 0 unmapped computes in cell: 7f90e64d-3b32-46fc-a8eb-3ec7679417e5
[OK] Compute host discovery successful! (1 host(s) in cell)
2026-03-07 20:59:25.140 1 INFO nova.compute.claims [None req-97e02dc8-d673-47f9-8103-c3d1acb2b181 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] [instance: 7e82bce5-f8bc-45e1-8d3c-77ad2cd96eb6] Claim successful on node np0005641307.novalocal
2026-03-07 20:59:27.211 1 INFO nova.compute.claims [None req-f40005f5-2ec9-47d5-b14d-c710df5086cf 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] [instance: c4261168-6509-4ef6-8c2c-27ff1a7135f3] Claim successful on node np0005641307.novalocal
2026-03-07 20:59:28.223 1 INFO nova.virt.libvirt.driver [None req-97e02dc8-d673-47f9-8103-c3d1acb2b181 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] [instance: 7e82bce5-f8bc-45e1-8d3c-77ad2cd96eb6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
2026-03-07 20:59:29.233 1 INFO nova.virt.block_device [None req-97e02dc8-d673-47f9-8103-c3d1acb2b181 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] [instance: 7e82bce5-f8bc-45e1-8d3c-77ad2cd96eb6] Booting with volume-backed-image e4532849-66d6-48ad-ad54-0dd08b232f3b at /dev/vda
2026-03-07 20:59:30.293 1 INFO nova.virt.libvirt.driver [None req-f40005f5-2ec9-47d5-b14d-c710df5086cf 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] [instance: c4261168-6509-4ef6-8c2c-27ff1a7135f3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
2026-03-07 20:59:31.805 1 INFO nova.virt.block_device [None req-f40005f5-2ec9-47d5-b14d-c710df5086cf 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] [instance: c4261168-6509-4ef6-8c2c-27ff1a7135f3] Booting with volume 3e08479b-128c-4ff6-8ed8-312b94742dc9 at /dev/vdb
2026-03-07 20:59:31.850 1 INFO oslo.privsep.daemon [None req-f40005f5-2ec9-47d5-b14d-c710df5086cf 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--privsep_context', 'os_brick.privileged.default', '--privsep_sock_path', '/tmp/tmpk8o5hrwr/privsep.sock']
2026-03-07 20:59:31.865 1 WARNING oslo.privsep.daemon [-] privsep log: sudo: unable to resolve host nova-compute: Name or service not known
2026-03-07 20:59:32.514 1 INFO oslo.privsep.daemon [None req-f40005f5-2ec9-47d5-b14d-c710df5086cf 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] Spawned new privsep daemon via rootwrap
2026-03-07 20:59:32.399 94 INFO oslo.privsep.daemon [-] privsep daemon starting
2026-03-07 20:59:32.404 94 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
2026-03-07 20:59:32.409 94 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
2026-03-07 20:59:32.409 94 INFO oslo.privsep.daemon [-] privsep daemon running as pid 94
2026-03-07 20:59:32.670 1 WARNING os_brick.initiator.connectors.iscsi [None req-f40005f5-2ec9-47d5-b14d-c710df5086cf 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] Could not find the iSCSI Initiator File /etc/iscsi/initiatorname.iscsi: oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command.
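The background discovery pass logged earlier ("Discovery attempt 1/10 ... [OK] Compute host discovery successful!") reads like a retry wrapper around `nova-manage cell_v2 discover_hosts`. A hedged sketch of such a loop; the actual script is not shown, and the attempt count and sleep interval are assumptions:

```shell
#!/bin/sh
# Sketch of a compute-host discovery retry loop, assuming it wraps
# "nova-manage cell_v2 discover_hosts --verbose" (the real command that
# produces the "Found N cell mappings" output seen in the log).
run_discovery() {
    max_attempts="$1"
    shift
    i=1
    while [ "$i" -le "$max_attempts" ]; do
        echo "Discovery attempt $i/$max_attempts..."
        # discover_hosts maps any unmapped compute services into their cells.
        if "$@"; then
            echo "[OK] Compute host discovery successful!"
            return 0
        fi
        i=$((i + 1))
        sleep 1
    done
    echo "Discovery failed after $max_attempts attempts" >&2
    return 1
}

# Usage (assumption: run where nova-manage and the Nova DB are reachable):
# run_discovery 10 nova-manage cell_v2 discover_hosts --verbose
```

Parameterizing the command keeps the retry logic separate from the nova-manage invocation, which also makes the wrapper testable without a running control plane.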
2026-03-07 20:59:32.671 1 WARNING os_brick.initiator.connectors.nvmeof [None req-f40005f5-2ec9-47d5-b14d-c710df5086cf 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] Could not find nvme_core/parameters/multipath: FileNotFoundError: [Errno 2] No such file or directory: '/sys/module/nvme_core/parameters/multipath'
2026-03-07 20:59:32.675 1 WARNING os_brick.initiator.connectors.iscsi [None req-97e02dc8-d673-47f9-8103-c3d1acb2b181 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] Could not find the iSCSI Initiator File /etc/iscsi/initiatorname.iscsi: oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while running command.
2026-03-07 20:59:32.700 94 WARNING os_brick.privileged.nvmeof [-] Could not generate host nqn: [Errno 2] No such file or directory: 'nvme'
2026-03-07 20:59:32.706 1 INFO os_brick.initiator.connectors.lightos [None req-f40005f5-2ec9-47d5-b14d-c710df5086cf 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] Current host hostNQN and IP(s) are ['38.102.83.39', 'fe80::f816:3eff:fe96:be77', '192.168.122.1', '172.31.0.129', 'fe80::8c0:92ff:fe0d:7648', '172.31.0.1', 'fe80::f003:46ff:fe0f:b189', 'fe80::2c90:67ff:fecd:3a0c', 'fe80::c426:adff:fef4:bab', 'fe80::74e0:7cff:fe31:6944', 'fe80::8892:30ff:feca:7c2e', 'fe80::84dc:c6ff:fe67:4968', 'fe80::42b:f9ff:fe64:9c9a', 'fe80::88dd:73ff:fe01:b70b', 'fe80::98a0:84ff:fe5e:626', 'fe80::a885:f3ff:fe80:d92f', 'fe80::dc86:dcff:fe4e:9ec5', 'fe80::bca4:6aff:fe6c:a672', 'fe80::acd7:32ff:fe4d:775c', 'fe80::30af:60ff:fe56:2bde', 'fe80::b484:47ff:fe99:5fc8', 'fe80::38ff:77ff:fe41:c465', 'fe80::60c9:5cff:fe92:8888', 'fe80::ccc0:90ff:fe55:15ea']
2026-03-07 20:59:32.716 94 WARNING os_brick.privileged.nvmeof [-] Could not generate host nqn: [Errno 2] No such file or directory: 'nvme'
2026-03-07 20:59:32.720 1 INFO os_brick.initiator.connectors.lightos [None req-97e02dc8-d673-47f9-8103-c3d1acb2b181 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] Current host hostNQN and IP(s) are ['38.102.83.39', 'fe80::f816:3eff:fe96:be77', '192.168.122.1', '172.31.0.129', 'fe80::8c0:92ff:fe0d:7648', '172.31.0.1', 'fe80::f003:46ff:fe0f:b189', 'fe80::2c90:67ff:fecd:3a0c', 'fe80::c426:adff:fef4:bab', 'fe80::74e0:7cff:fe31:6944', 'fe80::8892:30ff:feca:7c2e', 'fe80::84dc:c6ff:fe67:4968', 'fe80::42b:f9ff:fe64:9c9a', 'fe80::88dd:73ff:fe01:b70b', 'fe80::98a0:84ff:fe5e:626', 'fe80::a885:f3ff:fe80:d92f', 'fe80::dc86:dcff:fe4e:9ec5', 'fe80::bca4:6aff:fe6c:a672', 'fe80::acd7:32ff:fe4d:775c', 'fe80::30af:60ff:fe56:2bde', 'fe80::b484:47ff:fe99:5fc8', 'fe80::38ff:77ff:fe41:c465', 'fe80::60c9:5cff:fe92:8888', 'fe80::ccc0:90ff:fe55:15ea']
2026-03-07 20:59:34.850 1 INFO nova.virt.libvirt.driver [None req-f40005f5-2ec9-47d5-b14d-c710df5086cf 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] [instance: c4261168-6509-4ef6-8c2c-27ff1a7135f3] Creating image(s)
2026-03-07 20:59:34.858 1 INFO oslo.privsep.daemon [None req-f40005f5-2ec9-47d5-b14d-c710df5086cf 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmptt0r9s7j/privsep.sock']
2026-03-07 20:59:34.865 1 INFO nova.virt.libvirt.driver [None req-97e02dc8-d673-47f9-8103-c3d1acb2b181 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] [instance: 7e82bce5-f8bc-45e1-8d3c-77ad2cd96eb6] Creating image(s)
2026-03-07 20:59:34.870 1 WARNING oslo.privsep.daemon [-] privsep log: sudo: unable to resolve host nova-compute: Name or service not known
2026-03-07 20:59:34.870 1 WARNING nova.virt.libvirt.driver [None req-97e02dc8-d673-47f9-8103-c3d1acb2b181 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
2026-03-07 20:59:35.477 1 INFO oslo.privsep.daemon [None req-f40005f5-2ec9-47d5-b14d-c710df5086cf 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] Spawned new privsep daemon via rootwrap
2026-03-07 20:59:35.345 112 INFO oslo.privsep.daemon [-] privsep daemon starting
2026-03-07 20:59:35.352 112 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
2026-03-07 20:59:35.356 112 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
2026-03-07 20:59:35.356 112 INFO oslo.privsep.daemon [-] privsep daemon running as pid 112
2026-03-07 20:59:35.479 1 WARNING oslo_privsep.priv_context [None req-97e02dc8-d673-47f9-8103-c3d1acb2b181 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] privsep daemon already running
2026-03-07 20:59:36.189 1 WARNING nova.virt.libvirt.driver [None req-f40005f5-2ec9-47d5-b14d-c710df5086cf 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
2026-03-07 20:59:37.044 1 INFO oslo.privsep.daemon [None req-97e02dc8-d673-47f9-8103-c3d1acb2b181 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpsy46pmrb/privsep.sock']
2026-03-07 20:59:37.055 1 WARNING oslo.privsep.daemon [-] privsep log: sudo: unable to resolve host nova-compute: Name or service not known
2026-03-07 20:59:37.669 1 INFO oslo.privsep.daemon [None req-97e02dc8-d673-47f9-8103-c3d1acb2b181 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] Spawned new privsep daemon via rootwrap
2026-03-07 20:59:37.543 144 INFO oslo.privsep.daemon [-] privsep daemon starting
2026-03-07 20:59:37.547 144 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
2026-03-07 20:59:37.549 144 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
2026-03-07 20:59:37.549 144 INFO oslo.privsep.daemon [-] privsep daemon running as pid 144
2026-03-07 20:59:37.671 1 WARNING oslo_privsep.priv_context [None req-f40005f5-2ec9-47d5-b14d-c710df5086cf 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] privsep daemon already running
2026-03-07 20:59:37.958 1 INFO os_vif [None req-97e02dc8-d673-47f9-8103-c3d1acb2b181 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:4a:ae,bridge_name='hot-int',has_traffic_filtering=True,id=1c68c866-c63a-4d26-a2e1-beec66268a51,network=Network(0d916264-00bc-49d2-91f0-074c75a3a921),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1c68c866-c6')
2026-03-07 20:59:37.972 1 INFO os_vif [None req-f40005f5-2ec9-47d5-b14d-c710df5086cf 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:ee:9d,bridge_name='hot-int',has_traffic_filtering=True,id=6f325fc8-654c-43d7-a96c-c9c023fc0e35,network=Network(0d916264-00bc-49d2-91f0-074c75a3a921),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6f325fc8-65')
2026-03-07 20:59:39.730 1 INFO nova.compute.manager [None req-823894bd-dbae-4bac-ab7f-1556925be2fa - - - - - -] [instance: 7e82bce5-f8bc-45e1-8d3c-77ad2cd96eb6] VM Started (Lifecycle Event)
2026-03-07 20:59:39.800 1 INFO nova.virt.libvirt.driver [-] [instance: c4261168-6509-4ef6-8c2c-27ff1a7135f3] Instance spawned successfully.
2026-03-07 20:59:39.804 1 INFO nova.virt.libvirt.driver [-] [instance: 7e82bce5-f8bc-45e1-8d3c-77ad2cd96eb6] Instance spawned successfully.
2026-03-07 20:59:40.739 1 INFO nova.compute.manager [None req-823894bd-dbae-4bac-ab7f-1556925be2fa - - - - - -] [instance: 7e82bce5-f8bc-45e1-8d3c-77ad2cd96eb6] During sync_power_state the instance has a pending task (spawning). Skip.
2026-03-07 20:59:40.740 1 INFO nova.compute.manager [None req-823894bd-dbae-4bac-ab7f-1556925be2fa - - - - - -] [instance: 7e82bce5-f8bc-45e1-8d3c-77ad2cd96eb6] VM Paused (Lifecycle Event)
2026-03-07 20:59:40.846 1 INFO nova.compute.manager [None req-f40005f5-2ec9-47d5-b14d-c710df5086cf 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] [instance: c4261168-6509-4ef6-8c2c-27ff1a7135f3] Took 6.00 seconds to spawn the instance on the hypervisor.
2026-03-07 20:59:40.848 1 INFO nova.compute.manager [None req-97e02dc8-d673-47f9-8103-c3d1acb2b181 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] [instance: 7e82bce5-f8bc-45e1-8d3c-77ad2cd96eb6] Took 5.98 seconds to spawn the instance on the hypervisor.
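The recurring "sudo: unable to resolve host nova-compute" warning above is harmless to privsep (the daemons still spawn) but typically means the container's hostname has no entry in /etc/hosts. A hedged sketch of an idempotent fix, assuming the hostname really is `nova-compute` as the warning text suggests; the helper name `ensure_host_entry` is hypothetical:

```shell
#!/bin/sh
# Hypothetical fix for "sudo: unable to resolve host <name>": make sure the
# hostname resolves locally by appending a loopback entry if it is missing.
ensure_host_entry() {
    hosts_file="$1"
    name="$2"
    # -w matches the name as a whole word so "nova-compute2" doesn't count.
    grep -qw "$name" "$hosts_file" 2>/dev/null \
        || printf '127.0.1.1 %s\n' "$name" >> "$hosts_file"
}

# Usage in the container (assumption):
# ensure_host_entry /etc/hosts "$(hostname)"
```

Guarding the append with `grep` keeps the function idempotent, so it is safe to run on every container start.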
2026-03-07 20:59:41.248 1 INFO nova.compute.manager [None req-823894bd-dbae-4bac-ab7f-1556925be2fa - - - - - -] [instance: c4261168-6509-4ef6-8c2c-27ff1a7135f3] VM Started (Lifecycle Event)
2026-03-07 20:59:41.384 1 INFO nova.compute.manager [None req-f40005f5-2ec9-47d5-b14d-c710df5086cf 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] [instance: c4261168-6509-4ef6-8c2c-27ff1a7135f3] Took 16.28 seconds to build instance.
2026-03-07 20:59:41.387 1 INFO nova.compute.manager [None req-97e02dc8-d673-47f9-8103-c3d1acb2b181 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] [instance: 7e82bce5-f8bc-45e1-8d3c-77ad2cd96eb6] Took 16.27 seconds to build instance.
2026-03-07 20:59:41.819 1 WARNING nova.compute.manager [req-6614709c-4444-4d4c-8848-a5b3dbdea589 req-1320204b-9ec5-4309-ba1a-7a1f6b8de59c 5259aa3da3b14574bbe51077fc18f9d9 cc617b353e7d436abe77cb32b7a52b2b - - default default] [instance: c4261168-6509-4ef6-8c2c-27ff1a7135f3] Received unexpected event network-vif-plugged-6f325fc8-654c-43d7-a96c-c9c023fc0e35 for instance with vm_state active and task_state None.
2026-03-07 20:59:41.832 1 WARNING nova.compute.manager [req-b9d4ecb4-280a-4dc6-bc46-e4b37f59f0bc req-64e4f01f-9706-4d8b-89f8-5408f0bd6275 5259aa3da3b14574bbe51077fc18f9d9 cc617b353e7d436abe77cb32b7a52b2b - - default default] [instance: 7e82bce5-f8bc-45e1-8d3c-77ad2cd96eb6] Received unexpected event network-vif-plugged-1c68c866-c63a-4d26-a2e1-beec66268a51 for instance with vm_state active and task_state None.
2026-03-07 20:59:42.260 1 INFO nova.compute.manager [None req-823894bd-dbae-4bac-ab7f-1556925be2fa - - - - - -] [instance: c4261168-6509-4ef6-8c2c-27ff1a7135f3] VM Paused (Lifecycle Event)
2026-03-07 20:59:42.768 1 INFO nova.compute.manager [None req-823894bd-dbae-4bac-ab7f-1556925be2fa - - - - - -] [instance: c4261168-6509-4ef6-8c2c-27ff1a7135f3] VM Resumed (Lifecycle Event)
2026-03-07 20:59:43.780 1 INFO nova.compute.manager [None req-823894bd-dbae-4bac-ab7f-1556925be2fa - - - - - -] [instance: 7e82bce5-f8bc-45e1-8d3c-77ad2cd96eb6] VM Resumed (Lifecycle Event)
2026-03-07 20:59:50.398 1 WARNING nova.virt.libvirt.driver [None req-a30c3836-0553-4b32-81be-f46c8bd1a443 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
2026-03-07 21:00:20.074 1 INFO nova.compute.manager [None req-846016df-94f7-4606-93d3-4cc0cd43b980 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] [instance: c4261168-6509-4ef6-8c2c-27ff1a7135f3] Terminating instance
2026-03-07 21:00:20.806 1 INFO nova.virt.libvirt.driver [-] [instance: c4261168-6509-4ef6-8c2c-27ff1a7135f3] Instance destroyed successfully.
2026-03-07 21:00:21.326 1 INFO os_vif [None req-846016df-94f7-4606-93d3-4cc0cd43b980 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:ee:9d,bridge_name='hot-int',has_traffic_filtering=True,id=6f325fc8-654c-43d7-a96c-c9c023fc0e35,network=Network(0d916264-00bc-49d2-91f0-074c75a3a921),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6f325fc8-65')
2026-03-07 21:00:21.328 1 INFO nova.virt.libvirt.driver [None req-846016df-94f7-4606-93d3-4cc0cd43b980 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] [instance: c4261168-6509-4ef6-8c2c-27ff1a7135f3] Deleting instance files /var/lib/hotstack-os/nova-instances/c4261168-6509-4ef6-8c2c-27ff1a7135f3_del
2026-03-07 21:00:21.329 1 INFO nova.virt.libvirt.driver [None req-846016df-94f7-4606-93d3-4cc0cd43b980 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] [instance: c4261168-6509-4ef6-8c2c-27ff1a7135f3] Deletion of /var/lib/hotstack-os/nova-instances/c4261168-6509-4ef6-8c2c-27ff1a7135f3_del complete
2026-03-07 21:00:21.843 1 INFO nova.compute.manager [None req-846016df-94f7-4606-93d3-4cc0cd43b980 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] [instance: c4261168-6509-4ef6-8c2c-27ff1a7135f3] Took 1.26 seconds to destroy the instance on the hypervisor.
2026-03-07 21:00:22.280 1 INFO nova.network.neutron [req-94c03437-9b1f-46ca-b92e-e6335f7920d2 req-179fd38f-101f-4c07-8427-6528c1403a24 5259aa3da3b14574bbe51077fc18f9d9 cc617b353e7d436abe77cb32b7a52b2b - - default default] [instance: c4261168-6509-4ef6-8c2c-27ff1a7135f3] Port 6f325fc8-654c-43d7-a96c-c9c023fc0e35 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
2026-03-07 21:00:23.035 1 INFO nova.compute.manager [None req-f3bc613f-e77d-45a8-a1a3-b280b6d20817 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] [instance: 7e82bce5-f8bc-45e1-8d3c-77ad2cd96eb6] Terminating instance
2026-03-07 21:00:23.336 1 INFO nova.compute.manager [-] [instance: c4261168-6509-4ef6-8c2c-27ff1a7135f3] Took 1.49 seconds to deallocate network for instance.
2026-03-07 21:00:23.764 1 INFO nova.virt.libvirt.driver [-] [instance: 7e82bce5-f8bc-45e1-8d3c-77ad2cd96eb6] Instance destroyed successfully.
2026-03-07 21:00:23.891 1 INFO nova.compute.manager [None req-846016df-94f7-4606-93d3-4cc0cd43b980 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] [instance: c4261168-6509-4ef6-8c2c-27ff1a7135f3] Took 0.55 seconds to detach 1 volumes for instance.
2026-03-07 21:00:24.276 1 INFO os_vif [None req-f3bc613f-e77d-45a8-a1a3-b280b6d20817 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:4a:ae,bridge_name='hot-int',has_traffic_filtering=True,id=1c68c866-c63a-4d26-a2e1-beec66268a51,network=Network(0d916264-00bc-49d2-91f0-074c75a3a921),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1c68c866-c6')
2026-03-07 21:00:24.279 1 INFO nova.virt.libvirt.driver [None req-f3bc613f-e77d-45a8-a1a3-b280b6d20817 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] [instance: 7e82bce5-f8bc-45e1-8d3c-77ad2cd96eb6] Deleting instance files /var/lib/hotstack-os/nova-instances/7e82bce5-f8bc-45e1-8d3c-77ad2cd96eb6_del
2026-03-07 21:00:24.279 1 INFO nova.virt.libvirt.driver [None req-f3bc613f-e77d-45a8-a1a3-b280b6d20817 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] [instance: 7e82bce5-f8bc-45e1-8d3c-77ad2cd96eb6] Deletion of /var/lib/hotstack-os/nova-instances/7e82bce5-f8bc-45e1-8d3c-77ad2cd96eb6_del complete
2026-03-07 21:00:24.784 1 INFO nova.compute.manager [None req-f3bc613f-e77d-45a8-a1a3-b280b6d20817 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] [instance: 7e82bce5-f8bc-45e1-8d3c-77ad2cd96eb6] Took 1.25 seconds to destroy the instance on the hypervisor.
2026-03-07 21:00:25.373 1 INFO nova.network.neutron [req-f3cfbc8a-b94d-45f8-b4ac-dae89bb41f86 req-e6232584-686e-44f6-8b90-6dee61f38464 5259aa3da3b14574bbe51077fc18f9d9 cc617b353e7d436abe77cb32b7a52b2b - - default default] [instance: 7e82bce5-f8bc-45e1-8d3c-77ad2cd96eb6] Port 1c68c866-c63a-4d26-a2e1-beec66268a51 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
2026-03-07 21:00:25.530 1 INFO nova.scheduler.client.report [None req-846016df-94f7-4606-93d3-4cc0cd43b980 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] Deleted allocations for instance c4261168-6509-4ef6-8c2c-27ff1a7135f3
2026-03-07 21:00:26.166 1 INFO nova.compute.manager [-] [instance: 7e82bce5-f8bc-45e1-8d3c-77ad2cd96eb6] Took 1.38 seconds to deallocate network for instance.
2026-03-07 21:00:26.721 1 INFO nova.compute.manager [None req-f3bc613f-e77d-45a8-a1a3-b280b6d20817 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] [instance: 7e82bce5-f8bc-45e1-8d3c-77ad2cd96eb6] Took 0.56 seconds to detach 1 volumes for instance.
2026-03-07 21:00:28.335 1 INFO nova.scheduler.client.report [None req-f3bc613f-e77d-45a8-a1a3-b280b6d20817 70381897a25e4c2c9ae46514f1330ea8 9228e4d90852429fad3ab7743c36dc19 - - default default] Deleted allocations for instance 7e82bce5-f8bc-45e1-8d3c-77ad2cd96eb6
2026-03-07 21:00:35.803 1 INFO nova.compute.manager [-] [instance: c4261168-6509-4ef6-8c2c-27ff1a7135f3] VM Stopped (Lifecycle Event)
2026-03-07 21:00:38.763 1 INFO nova.compute.manager [-] [instance: 7e82bce5-f8bc-45e1-8d3c-77ad2cd96eb6] VM Stopped (Lifecycle Event)