[WARNING]: While constructing a mapping from /home/zuul/src/review.rdoproject.org/rdo-jobs/playbooks/data_plane_adoption/vars.yaml, line 6, column 1, found a duplicate dict key (edpm_node_hostname). Using last defined value only.
[WARNING]: While constructing a mapping from /home/zuul/src/review.rdoproject.org/rdo-jobs/playbooks/data_plane_adoption/vars.yaml, line 6, column 1, found a duplicate dict key (edpm_node_ip). Using last defined value only.
[WARNING]: While constructing a mapping from /home/zuul/src/review.rdoproject.org/rdo-jobs/playbooks/data_plane_adoption/vars.yaml, line 6, column 1, found a duplicate dict key (source_galera_members). Using last defined value only.
[WARNING]: While constructing a mapping from /home/zuul/src/review.rdoproject.org/rdo-jobs/playbooks/data_plane_adoption/vars.yaml, line 6, column 1, found a duplicate dict key (source_mariadb_ip). Using last defined value only.
[WARNING]: While constructing a mapping from /home/zuul/src/review.rdoproject.org/rdo-jobs/playbooks/data_plane_adoption/vars.yaml, line 6, column 1, found a duplicate dict key (enable_tlse). Using last defined value only.
[WARNING]: While constructing a mapping from /home/zuul/src/review.rdoproject.org/rdo-jobs/playbooks/data_plane_adoption/vars.yaml, line 6, column 1, found a duplicate dict key (neutron_qe_test). Using last defined value only.
[WARNING]: While constructing a mapping from /home/zuul/src/review.rdoproject.org/rdo-jobs/playbooks/data_plane_adoption/vars.yaml, line 6, column 1, found a duplicate dict key (tobiko_qe_test). Using last defined value only.
[WARNING]: While constructing a mapping from /home/zuul/src/review.rdoproject.org/rdo-jobs/playbooks/data_plane_adoption/vars.yaml, line 6, column 1, found a duplicate dict key (neutron_qe_dir). Using last defined value only.
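These warnings mean the YAML loader found the same key defined more than once in vars.yaml and silently kept only the last value, which can mask a real configuration mistake. A quick way to surface such duplicates before a run is a line-based scan of top-level keys; this is a minimal stdlib-only sketch (the sample file contents are invented for illustration), not a full YAML parser:

```python
import re
from collections import Counter

def find_duplicate_keys(yaml_text):
    """Report top-level mapping keys that appear more than once.

    A line-based approximation: only unindented `key:` lines are
    considered, which is enough to catch the duplicates Ansible
    warns about in a flat vars file.
    """
    keys = [m.group(1)
            for line in yaml_text.splitlines()
            if (m := re.match(r"([A-Za-z_]\w*)\s*:", line))]
    return sorted(k for k, n in Counter(keys).items() if n > 1)

# Invented sample shaped like the vars file the warnings point at.
sample = """\
edpm_node_hostname: compute-0
edpm_node_ip: 192.0.2.10
enable_tlse: false
edpm_node_hostname: compute-1
"""
print(find_duplicate_keys(sample))  # -> ['edpm_node_hostname']
```

Running a check like this in CI before invoking ansible-playbook turns the silent "last value wins" behavior into an explicit failure.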
[WARNING]: While constructing a mapping from /home/zuul/src/review.rdoproject.org/rdo-jobs/playbooks/data_plane_adoption/vars.yaml, line 6, column 1, found a duplicate dict key (prelaunch_barbican_secret). Using last defined value only.
[WARNING]: While constructing a mapping from /home/zuul/src/review.rdoproject.org/rdo-jobs/playbooks/data_plane_adoption/vars.yaml, line 6, column 1, found a duplicate dict key (os_cloud_name). Using last defined value only.
[WARNING]: While constructing a mapping from /home/zuul/src/review.rdoproject.org/rdo-jobs/playbooks/data_plane_adoption/vars.yaml, line 6, column 1, found a duplicate dict key (standalone_ip). Using last defined value only.
Using /home/zuul/src/review.rdoproject.org/rdo-jobs/playbooks/data_plane_adoption/ansible.cfg as config file

PLAY [Externalize Ceph] ********************************************************

TASK [Gathering Facts] *********************************************************
ok: [np0005652755.localdomain]

TASK [ceph_migrate : Check file in the src directory] **************************
[WARNING]: Skipped '/home/tripleo-admin/ceph_client' path due to this access issue: '/home/tripleo-admin/ceph_client' is not a directory
ok: [np0005652755.localdomain] => {"changed": false, "examined": 0, "files": [], "matched": 0, "msg": "Not all paths examined, check warnings for details", "skipped_paths": {"/home/tripleo-admin/ceph_client": "'/home/tripleo-admin/ceph_client' is not a directory"}}

TASK [ceph_migrate : Restore files] ********************************************
skipping: [np0005652755.localdomain] => (item=ceph.conf) => {"ansible_loop_var": "item", "changed": false, "false_condition": "dir_ceph_files.files | length > 0", "item": "ceph.conf", "skip_reason": "Conditional result was False"}
skipping: [np0005652755.localdomain] => (item=ceph.client.admin.keyring) => {"ansible_loop_var": "item", "changed": false, "false_condition": "dir_ceph_files.files | length > 0", "item":
"ceph.client.admin.keyring", "skip_reason": "Conditional result was False"}
skipping: [np0005652755.localdomain] => {"changed": false, "msg": "All items skipped"}

TASK [ceph_migrate : Ensure backup directory exists] ***************************
skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "dump | default(false)", "skip_reason": "Conditional result was False"}

TASK [ceph_migrate : Get Ceph Health] ******************************************
changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["cephadm", "shell", "--", "ceph", "-s", "-f", "json"], "delta": "0:00:02.429331", "end": "2026-03-20 09:44:58.815849", "msg": "", "rc": 0, "start": "2026-03-20 09:44:56.386518", "stderr": "Inferring fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90\nInferring config /var/lib/ceph/39ff5591-b969-58ac-89fa-bf85e4fa1d90/mon.np0005652755/config\nUsing ceph image with id '957884a57883' and tag 'latest' created on 2026-02-09 10:26:08 +0000 UTC\nregistry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3", "stderr_lines": ["Inferring fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90", "Inferring config /var/lib/ceph/39ff5591-b969-58ac-89fa-bf85e4fa1d90/mon.np0005652755/config", "Using ceph image with id '957884a57883' and tag 'latest' created on 2026-02-09 10:26:08 +0000 UTC", "registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3"], "stdout":
"\n{\"fsid\":\"39ff5591-b969-58ac-89fa-bf85e4fa1d90\",\"health\":{\"status\":\"HEALTH_OK\",\"checks\":{},\"mutes\":[]},\"election_epoch\":14,\"quorum\":[0,1,2],\"quorum_names\":[\"np0005652755\",\"np0005652757\",\"np0005652756\"],\"quorum_age\":7249,\"monmap\":{\"epoch\":3,\"min_mon_release_name\":\"reef\",\"num_mons\":3},\"osdmap\":{\"epoch\":82,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1773992691,\"num_in_osds\":6,\"osd_in_since\":1773992670,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109571242,\"bytes_used\":607068160,\"bytes_avail\":44464922624,\"bytes_total\":45071990784},\"fsmap\":{\"epoch\":6,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005652757.yafnuy\",\"status\":\"up:active\",\"gid\":24262}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":2,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":66,\"modified\":\"2026-03-20T09:43:11.145468+0000\",\"services\":{}},\"progress_events\":{}}", "stdout_lines": ["", 
"{\"fsid\":\"39ff5591-b969-58ac-89fa-bf85e4fa1d90\",\"health\":{\"status\":\"HEALTH_OK\",\"checks\":{},\"mutes\":[]},\"election_epoch\":14,\"quorum\":[0,1,2],\"quorum_names\":[\"np0005652755\",\"np0005652757\",\"np0005652756\"],\"quorum_age\":7249,\"monmap\":{\"epoch\":3,\"min_mon_release_name\":\"reef\",\"num_mons\":3},\"osdmap\":{\"epoch\":82,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1773992691,\"num_in_osds\":6,\"osd_in_since\":1773992670,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109571242,\"bytes_used\":607068160,\"bytes_avail\":44464922624,\"bytes_total\":45071990784},\"fsmap\":{\"epoch\":6,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005652757.yafnuy\",\"status\":\"up:active\",\"gid\":24262}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":2,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":66,\"modified\":\"2026-03-20T09:43:11.145468+0000\",\"services\":{}},\"progress_events\":{}}"]} TASK [ceph_migrate : Load ceph data] ******************************************* ok: [np0005652755.localdomain] => {"ansible_facts": {"ceph": {"election_epoch": 14, "fsid": "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "fsmap": {"by_rank": [{"filesystem_id": 1, "gid": 24262, "name": "mds.np0005652757.yafnuy", "rank": 0, "status": "up:active"}], "epoch": 6, "id": 1, "in": 1, "max": 1, "up": 1, "up:standby": 2}, "health": {"checks": {}, "mutes": [], "status": "HEALTH_OK"}, "mgrmap": {"available": true, "modules": ["cephadm", "iostat", "nfs", "restful"], "num_standbys": 2, "services": {}}, "monmap": {"epoch": 3, "min_mon_release_name": "reef", "num_mons": 3}, "osdmap": {"epoch": 82, "num_in_osds": 6, "num_osds": 6, "num_remapped_pgs": 0, "num_up_osds": 6, "osd_in_since": 1773992670, "osd_up_since": 1773992691}, "pgmap": {"bytes_avail": 
44464922624, "bytes_total": 45071990784, "bytes_used": 607068160, "data_bytes": 109571242, "num_objects": 69, "num_pgs": 177, "num_pools": 7, "pgs_by_state": [{"count": 177, "state_name": "active+clean"}]}, "progress_events": {}, "quorum": [0, 1, 2], "quorum_age": 7249, "quorum_names": ["np0005652755", "np0005652757", "np0005652756"], "servicemap": {"epoch": 66, "modified": "2026-03-20T09:43:11.145468+0000", "services": {}}}}, "changed": false} TASK [ceph_migrate : Dump ceph -s output to log file] ************************** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "dump | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Get Ceph Orch ServiceMap] ********************************* changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["cephadm", "shell", "--", "ceph", "orch", "ls", "-f", "json"], "delta": "0:00:02.492959", "end": "2026-03-20 09:45:01.907901", "msg": "", "rc": 0, "start": "2026-03-20 09:44:59.414942", "stderr": "Inferring fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90\nInferring config /var/lib/ceph/39ff5591-b969-58ac-89fa-bf85e4fa1d90/mon.np0005652755/config\nUsing ceph image with id '957884a57883' and tag 'latest' created on 2026-02-09 10:26:08 +0000 UTC\nregistry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3", "stderr_lines": ["Inferring fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90", "Inferring config /var/lib/ceph/39ff5591-b969-58ac-89fa-bf85e4fa1d90/mon.np0005652755/config", "Using ceph image with id '957884a57883' and tag 'latest' created on 2026-02-09 10:26:08 +0000 UTC", "registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3"], "stdout": "\n[{\"events\": [\"2026-03-20T07:44:23.894549Z service:crash [INFO] \\\"service was created\\\"\"], \"placement\": {\"host_pattern\": \"*\"}, \"service_name\": \"crash\", \"service_type\": \"crash\", \"status\": 
{\"created\": \"2026-03-20T07:42:24.149793Z\", \"last_refresh\": \"2026-03-20T09:37:44.749714Z\", \"running\": 6, \"size\": 6}}, {\"events\": [\"2026-03-20T08:03:57.970970Z service:mds.mds [INFO] \\\"service was created\\\"\"], \"placement\": {\"hosts\": [\"np0005652755.localdomain\", \"np0005652756.localdomain\", \"np0005652757.localdomain\"]}, \"service_id\": \"mds\", \"service_name\": \"mds.mds\", \"service_type\": \"mds\", \"status\": {\"created\": \"2026-03-20T08:03:51.021331Z\", \"last_refresh\": \"2026-03-20T09:37:44.749793Z\", \"running\": 3, \"size\": 3}}, {\"events\": [\"2026-03-20T07:44:11.076379Z service:mgr [INFO] \\\"service was created\\\"\", \"2026-03-20T07:43:15.330356Z service:mgr [ERROR] \\\"Failed to apply: Cannot place on np0005652757.localdomain: Unknown hosts\\\"\"], \"placement\": {\"hosts\": [\"np0005652755.localdomain\", \"np0005652756.localdomain\", \"np0005652757.localdomain\"]}, \"service_name\": \"mgr\", \"service_type\": \"mgr\", \"status\": {\"created\": \"2026-03-20T07:43:06.017590Z\", \"last_refresh\": \"2026-03-20T09:37:44.749634Z\", \"running\": 3, \"size\": 3}}, {\"events\": [\"2026-03-20T07:44:02.673932Z service:mon [INFO] \\\"service was created\\\"\", \"2026-03-20T07:43:15.329121Z service:mon [ERROR] \\\"Failed to apply: Cannot place on np0005652757.localdomain: Unknown hosts\\\"\"], \"placement\": {\"hosts\": [\"np0005652755.localdomain\", \"np0005652756.localdomain\", \"np0005652757.localdomain\"]}, \"service_name\": \"mon\", \"service_type\": \"mon\", \"status\": {\"created\": \"2026-03-20T07:43:06.008250Z\", \"last_refresh\": \"2026-03-20T09:37:44.749502Z\", \"running\": 3, \"size\": 3}}, {\"events\": [\"2026-03-20T07:42:38.074097Z service:node-proxy [INFO] \\\"service was created\\\"\"], \"placement\": {\"host_pattern\": \"*\"}, \"service_name\": \"node-proxy\", \"service_type\": \"node-proxy\", \"status\": {\"created\": \"2026-03-20T07:42:38.049883Z\", \"running\": 0, \"size\": 0}}, {\"events\": 
[\"2026-03-20T07:43:06.034309Z service:osd.default_drive_group [INFO] \\\"service was created\\\"\"], \"placement\": {\"hosts\": [\"np0005652759.localdomain\", \"np0005652760.localdomain\", \"np0005652761.localdomain\"]}, \"service_id\": \"default_drive_group\", \"service_name\": \"osd.default_drive_group\", \"service_type\": \"osd\", \"spec\": {\"data_devices\": {\"paths\": [\"/dev/ceph_vg0/ceph_lv0\", \"/dev/ceph_vg1/ceph_lv1\"]}, \"filter_logic\": \"AND\", \"objectstore\": \"bluestore\"}, \"status\": {\"created\": \"2026-03-20T07:43:06.026205Z\", \"last_refresh\": \"2026-03-20T09:40:51.922366Z\", \"running\": 6, \"size\": 6}}]", "stdout_lines": ["", "[{\"events\": [\"2026-03-20T07:44:23.894549Z service:crash [INFO] \\\"service was created\\\"\"], \"placement\": {\"host_pattern\": \"*\"}, \"service_name\": \"crash\", \"service_type\": \"crash\", \"status\": {\"created\": \"2026-03-20T07:42:24.149793Z\", \"last_refresh\": \"2026-03-20T09:37:44.749714Z\", \"running\": 6, \"size\": 6}}, {\"events\": [\"2026-03-20T08:03:57.970970Z service:mds.mds [INFO] \\\"service was created\\\"\"], \"placement\": {\"hosts\": [\"np0005652755.localdomain\", \"np0005652756.localdomain\", \"np0005652757.localdomain\"]}, \"service_id\": \"mds\", \"service_name\": \"mds.mds\", \"service_type\": \"mds\", \"status\": {\"created\": \"2026-03-20T08:03:51.021331Z\", \"last_refresh\": \"2026-03-20T09:37:44.749793Z\", \"running\": 3, \"size\": 3}}, {\"events\": [\"2026-03-20T07:44:11.076379Z service:mgr [INFO] \\\"service was created\\\"\", \"2026-03-20T07:43:15.330356Z service:mgr [ERROR] \\\"Failed to apply: Cannot place on np0005652757.localdomain: Unknown hosts\\\"\"], \"placement\": {\"hosts\": [\"np0005652755.localdomain\", \"np0005652756.localdomain\", \"np0005652757.localdomain\"]}, \"service_name\": \"mgr\", \"service_type\": \"mgr\", \"status\": {\"created\": \"2026-03-20T07:43:06.017590Z\", \"last_refresh\": \"2026-03-20T09:37:44.749634Z\", \"running\": 3, \"size\": 3}}, 
{\"events\": [\"2026-03-20T07:44:02.673932Z service:mon [INFO] \\\"service was created\\\"\", \"2026-03-20T07:43:15.329121Z service:mon [ERROR] \\\"Failed to apply: Cannot place on np0005652757.localdomain: Unknown hosts\\\"\"], \"placement\": {\"hosts\": [\"np0005652755.localdomain\", \"np0005652756.localdomain\", \"np0005652757.localdomain\"]}, \"service_name\": \"mon\", \"service_type\": \"mon\", \"status\": {\"created\": \"2026-03-20T07:43:06.008250Z\", \"last_refresh\": \"2026-03-20T09:37:44.749502Z\", \"running\": 3, \"size\": 3}}, {\"events\": [\"2026-03-20T07:42:38.074097Z service:node-proxy [INFO] \\\"service was created\\\"\"], \"placement\": {\"host_pattern\": \"*\"}, \"service_name\": \"node-proxy\", \"service_type\": \"node-proxy\", \"status\": {\"created\": \"2026-03-20T07:42:38.049883Z\", \"running\": 0, \"size\": 0}}, {\"events\": [\"2026-03-20T07:43:06.034309Z service:osd.default_drive_group [INFO] \\\"service was created\\\"\"], \"placement\": {\"hosts\": [\"np0005652759.localdomain\", \"np0005652760.localdomain\", \"np0005652761.localdomain\"]}, \"service_id\": \"default_drive_group\", \"service_name\": \"osd.default_drive_group\", \"service_type\": \"osd\", \"spec\": {\"data_devices\": {\"paths\": [\"/dev/ceph_vg0/ceph_lv0\", \"/dev/ceph_vg1/ceph_lv1\"]}, \"filter_logic\": \"AND\", \"objectstore\": \"bluestore\"}, \"status\": {\"created\": \"2026-03-20T07:43:06.026205Z\", \"last_refresh\": \"2026-03-20T09:40:51.922366Z\", \"running\": 6, \"size\": 6}}]"]} TASK [ceph_migrate : Load Service Map] ***************************************** ok: [np0005652755.localdomain] => {"ansible_facts": {"servicemap": [{"events": ["2026-03-20T07:44:23.894549Z service:crash [INFO] \"service was created\""], "placement": {"host_pattern": "*"}, "service_name": "crash", "service_type": "crash", "status": {"created": "2026-03-20T07:42:24.149793Z", "last_refresh": "2026-03-20T09:37:44.749714Z", "running": 6, "size": 6}}, {"events": ["2026-03-20T08:03:57.970970Z 
service:mds.mds [INFO] \"service was created\""], "placement": {"hosts": ["np0005652755.localdomain", "np0005652756.localdomain", "np0005652757.localdomain"]}, "service_id": "mds", "service_name": "mds.mds", "service_type": "mds", "status": {"created": "2026-03-20T08:03:51.021331Z", "last_refresh": "2026-03-20T09:37:44.749793Z", "running": 3, "size": 3}}, {"events": ["2026-03-20T07:44:11.076379Z service:mgr [INFO] \"service was created\"", "2026-03-20T07:43:15.330356Z service:mgr [ERROR] \"Failed to apply: Cannot place on np0005652757.localdomain: Unknown hosts\""], "placement": {"hosts": ["np0005652755.localdomain", "np0005652756.localdomain", "np0005652757.localdomain"]}, "service_name": "mgr", "service_type": "mgr", "status": {"created": "2026-03-20T07:43:06.017590Z", "last_refresh": "2026-03-20T09:37:44.749634Z", "running": 3, "size": 3}}, {"events": ["2026-03-20T07:44:02.673932Z service:mon [INFO] \"service was created\"", "2026-03-20T07:43:15.329121Z service:mon [ERROR] \"Failed to apply: Cannot place on np0005652757.localdomain: Unknown hosts\""], "placement": {"hosts": ["np0005652755.localdomain", "np0005652756.localdomain", "np0005652757.localdomain"]}, "service_name": "mon", "service_type": "mon", "status": {"created": "2026-03-20T07:43:06.008250Z", "last_refresh": "2026-03-20T09:37:44.749502Z", "running": 3, "size": 3}}, {"events": ["2026-03-20T07:42:38.074097Z service:node-proxy [INFO] \"service was created\""], "placement": {"host_pattern": "*"}, "service_name": "node-proxy", "service_type": "node-proxy", "status": {"created": "2026-03-20T07:42:38.049883Z", "running": 0, "size": 0}}, {"events": ["2026-03-20T07:43:06.034309Z service:osd.default_drive_group [INFO] \"service was created\""], "placement": {"hosts": ["np0005652759.localdomain", "np0005652760.localdomain", "np0005652761.localdomain"]}, "service_id": "default_drive_group", "service_name": "osd.default_drive_group", "service_type": "osd", "spec": {"data_devices": {"paths": 
["/dev/ceph_vg0/ceph_lv0", "/dev/ceph_vg1/ceph_lv1"]}, "filter_logic": "AND", "objectstore": "bluestore"}, "status": {"created": "2026-03-20T07:43:06.026205Z", "last_refresh": "2026-03-20T09:40:51.922366Z", "running": 6, "size": 6}}]}, "changed": false} TASK [ceph_migrate : Print Service Map] **************************************** skipping: [np0005652755.localdomain] => (item={'events': ['2026-03-20T07:44:23.894549Z service:crash [INFO] "service was created"'], 'placement': {'host_pattern': '*'}, 'service_name': 'crash', 'service_type': 'crash', 'status': {'created': '2026-03-20T07:42:24.149793Z', 'last_refresh': '2026-03-20T09:37:44.749714Z', 'running': 6, 'size': 6}}) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": {"events": ["2026-03-20T07:44:23.894549Z service:crash [INFO] \"service was created\""], "placement": {"host_pattern": "*"}, "service_name": "crash", "service_type": "crash", "status": {"created": "2026-03-20T07:42:24.149793Z", "last_refresh": "2026-03-20T09:37:44.749714Z", "running": 6, "size": 6}}} skipping: [np0005652755.localdomain] => (item={'events': ['2026-03-20T08:03:57.970970Z service:mds.mds [INFO] "service was created"'], 'placement': {'hosts': ['np0005652755.localdomain', 'np0005652756.localdomain', 'np0005652757.localdomain']}, 'service_id': 'mds', 'service_name': 'mds.mds', 'service_type': 'mds', 'status': {'created': '2026-03-20T08:03:51.021331Z', 'last_refresh': '2026-03-20T09:37:44.749793Z', 'running': 3, 'size': 3}}) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": {"events": ["2026-03-20T08:03:57.970970Z service:mds.mds [INFO] \"service was created\""], "placement": {"hosts": ["np0005652755.localdomain", "np0005652756.localdomain", "np0005652757.localdomain"]}, "service_id": "mds", "service_name": "mds.mds", "service_type": "mds", "status": {"created": "2026-03-20T08:03:51.021331Z", "last_refresh": "2026-03-20T09:37:44.749793Z", "running": 3, "size": 3}}} 
skipping: [np0005652755.localdomain] => (item={'events': ['2026-03-20T07:44:11.076379Z service:mgr [INFO] "service was created"', '2026-03-20T07:43:15.330356Z service:mgr [ERROR] "Failed to apply: Cannot place on np0005652757.localdomain: Unknown hosts"'], 'placement': {'hosts': ['np0005652755.localdomain', 'np0005652756.localdomain', 'np0005652757.localdomain']}, 'service_name': 'mgr', 'service_type': 'mgr', 'status': {'created': '2026-03-20T07:43:06.017590Z', 'last_refresh': '2026-03-20T09:37:44.749634Z', 'running': 3, 'size': 3}}) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": {"events": ["2026-03-20T07:44:11.076379Z service:mgr [INFO] \"service was created\"", "2026-03-20T07:43:15.330356Z service:mgr [ERROR] \"Failed to apply: Cannot place on np0005652757.localdomain: Unknown hosts\""], "placement": {"hosts": ["np0005652755.localdomain", "np0005652756.localdomain", "np0005652757.localdomain"]}, "service_name": "mgr", "service_type": "mgr", "status": {"created": "2026-03-20T07:43:06.017590Z", "last_refresh": "2026-03-20T09:37:44.749634Z", "running": 3, "size": 3}}} skipping: [np0005652755.localdomain] => (item={'events': ['2026-03-20T07:44:02.673932Z service:mon [INFO] "service was created"', '2026-03-20T07:43:15.329121Z service:mon [ERROR] "Failed to apply: Cannot place on np0005652757.localdomain: Unknown hosts"'], 'placement': {'hosts': ['np0005652755.localdomain', 'np0005652756.localdomain', 'np0005652757.localdomain']}, 'service_name': 'mon', 'service_type': 'mon', 'status': {'created': '2026-03-20T07:43:06.008250Z', 'last_refresh': '2026-03-20T09:37:44.749502Z', 'running': 3, 'size': 3}}) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": {"events": ["2026-03-20T07:44:02.673932Z service:mon [INFO] \"service was created\"", "2026-03-20T07:43:15.329121Z service:mon [ERROR] \"Failed to apply: Cannot place on np0005652757.localdomain: Unknown hosts\""], "placement": {"hosts": 
["np0005652755.localdomain", "np0005652756.localdomain", "np0005652757.localdomain"]}, "service_name": "mon", "service_type": "mon", "status": {"created": "2026-03-20T07:43:06.008250Z", "last_refresh": "2026-03-20T09:37:44.749502Z", "running": 3, "size": 3}}} skipping: [np0005652755.localdomain] => (item={'events': ['2026-03-20T07:42:38.074097Z service:node-proxy [INFO] "service was created"'], 'placement': {'host_pattern': '*'}, 'service_name': 'node-proxy', 'service_type': 'node-proxy', 'status': {'created': '2026-03-20T07:42:38.049883Z', 'running': 0, 'size': 0}}) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": {"events": ["2026-03-20T07:42:38.074097Z service:node-proxy [INFO] \"service was created\""], "placement": {"host_pattern": "*"}, "service_name": "node-proxy", "service_type": "node-proxy", "status": {"created": "2026-03-20T07:42:38.049883Z", "running": 0, "size": 0}}} skipping: [np0005652755.localdomain] => (item={'events': ['2026-03-20T07:43:06.034309Z service:osd.default_drive_group [INFO] "service was created"'], 'placement': {'hosts': ['np0005652759.localdomain', 'np0005652760.localdomain', 'np0005652761.localdomain']}, 'service_id': 'default_drive_group', 'service_name': 'osd.default_drive_group', 'service_type': 'osd', 'spec': {'data_devices': {'paths': ['/dev/ceph_vg0/ceph_lv0', '/dev/ceph_vg1/ceph_lv1']}, 'filter_logic': 'AND', 'objectstore': 'bluestore'}, 'status': {'created': '2026-03-20T07:43:06.026205Z', 'last_refresh': '2026-03-20T09:40:51.922366Z', 'running': 6, 'size': 6}}) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": {"events": ["2026-03-20T07:43:06.034309Z service:osd.default_drive_group [INFO] \"service was created\""], "placement": {"hosts": ["np0005652759.localdomain", "np0005652760.localdomain", "np0005652761.localdomain"]}, "service_id": "default_drive_group", "service_name": "osd.default_drive_group", "service_type": "osd", "spec": {"data_devices": 
{"paths": ["/dev/ceph_vg0/ceph_lv0", "/dev/ceph_vg1/ceph_lv1"]}, "filter_logic": "AND", "objectstore": "bluestore"}, "status": {"created": "2026-03-20T07:43:06.026205Z", "last_refresh": "2026-03-20T09:40:51.922366Z", "running": 6, "size": 6}}}
skipping: [np0005652755.localdomain] => {"msg": "All items skipped"}

TASK [ceph_migrate : Dump ceph orch ls output to log file] *********************
skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "dump | default(false)", "skip_reason": "Conditional result was False"}

TASK [ceph_migrate : Get Ceph config] ******************************************
changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["cephadm", "shell", "--", "ceph", "config", "dump", "-f", "json"], "delta": "0:00:02.318633", "end": "2026-03-20 09:45:04.938954", "msg": "", "rc": 0, "start": "2026-03-20 09:45:02.620321", "stderr": "Inferring fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90\nInferring config /var/lib/ceph/39ff5591-b969-58ac-89fa-bf85e4fa1d90/mon.np0005652755/config\nUsing ceph image with id '957884a57883' and tag 'latest' created on 2026-02-09 10:26:08 +0000 UTC\nregistry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3", "stderr_lines": ["Inferring fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90", "Inferring config /var/lib/ceph/39ff5591-b969-58ac-89fa-bf85e4fa1d90/mon.np0005652755/config", "Using ceph image with id '957884a57883' and tag 'latest' created on 2026-02-09 10:26:08 +0000 UTC", "registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3"], "stdout":
"\n[{\"section\":\"global\",\"name\":\"cluster_network\",\"value\":\"172.20.0.0/24\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"container_image\",\"value\":\"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\",\"level\":\"basic\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"ms_bind_ipv4\",\"value\":\"true\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"ms_bind_ipv6\",\"value\":\"false\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"public_network\",\"value\":\"172.18.0.0/24\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mon\",\"name\":\"auth_allow_insecure_global_id_reclaim\",\"value\":\"false\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/container_image_base\",\"value\":\"registry.redhat.io/rhceph/rhceph-7-rhel9\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/container_init\",\"value\":\"True\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/migration_current\",\"value\":\"7\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/use_repo_digest\",\"value\":\"false\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/orchestrator/orchestrator\",\"value\":\"cephadm\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"osd\",\"name\":\"osd_memory_target\",\"value\":\"5709084876\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"host:np0005652759\",\"location_type\":\"host\",\"location_value\":\"np0005652759\"},{\"section\":\"osd\",\"name\":\"osd_memory_target\",\"
value\":\"5709084876\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"host:np0005652760\",\"location_type\":\"host\",\"location_value\":\"np0005652760\"},{\"section\":\"osd\",\"name\":\"osd_memory_target\",\"value\":\"5709084876\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"host:np0005652761\",\"location_type\":\"host\",\"location_value\":\"np0005652761\"},{\"section\":\"osd\",\"name\":\"osd_memory_target_autotune\",\"value\":\"true\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"mds.mds\",\"name\":\"mds_join_fs\",\"value\":\"mds\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"\"}]", "stdout_lines": ["", "[{\"section\":\"global\",\"name\":\"cluster_network\",\"value\":\"172.20.0.0/24\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"container_image\",\"value\":\"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\",\"level\":\"basic\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"ms_bind_ipv4\",\"value\":\"true\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"ms_bind_ipv6\",\"value\":\"false\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"public_network\",\"value\":\"172.18.0.0/24\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mon\",\"name\":\"auth_allow_insecure_global_id_reclaim\",\"value\":\"false\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/container_image_base\",\"value\":\"registry.redhat.io/rhceph/rhceph-7-rhel9\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/container_init\",\"value\":\"True\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/c
ephadm/migration_current\",\"value\":\"7\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/use_repo_digest\",\"value\":\"false\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/orchestrator/orchestrator\",\"value\":\"cephadm\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"osd\",\"name\":\"osd_memory_target\",\"value\":\"5709084876\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"host:np0005652759\",\"location_type\":\"host\",\"location_value\":\"np0005652759\"},{\"section\":\"osd\",\"name\":\"osd_memory_target\",\"value\":\"5709084876\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"host:np0005652760\",\"location_type\":\"host\",\"location_value\":\"np0005652760\"},{\"section\":\"osd\",\"name\":\"osd_memory_target\",\"value\":\"5709084876\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"host:np0005652761\",\"location_type\":\"host\",\"location_value\":\"np0005652761\"},{\"section\":\"osd\",\"name\":\"osd_memory_target_autotune\",\"value\":\"true\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"mds.mds\",\"name\":\"mds_join_fs\",\"value\":\"mds\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"\"}]"]}

TASK [ceph_migrate : Print Ceph config dump] ***********************************
skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"}

TASK [ceph_migrate : Dump ceph config dump output to log file] *****************
skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "dump | default(false)", "skip_reason": "Conditional result was False"}

TASK [ceph_migrate : Get Ceph Orch Host Map] ***********************************
changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["cephadm", "shell", "--", "ceph", "orch", "host", "ls", "-f", "json"], "delta":
"0:00:02.556861", "end": "2026-03-20 09:45:08.160639", "msg": "", "rc": 0, "start": "2026-03-20 09:45:05.603778", "stderr": "Inferring fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90\nInferring config /var/lib/ceph/39ff5591-b969-58ac-89fa-bf85e4fa1d90/mon.np0005652755/config\nUsing ceph image with id '957884a57883' and tag 'latest' created on 2026-02-09 10:26:08 +0000 UTC\nregistry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3", "stderr_lines": ["Inferring fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90", "Inferring config /var/lib/ceph/39ff5591-b969-58ac-89fa-bf85e4fa1d90/mon.np0005652755/config", "Using ceph image with id '957884a57883' and tag 'latest' created on 2026-02-09 10:26:08 +0000 UTC", "registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3"], "stdout": "\n[{\"addr\": \"192.168.122.103\", \"hostname\": \"np0005652755.localdomain\", \"labels\": [\"_admin\", \"mon\", \"mgr\"], \"status\": \"\"}, {\"addr\": \"192.168.122.104\", \"hostname\": \"np0005652756.localdomain\", \"labels\": [\"mon\", \"_admin\", \"mgr\"], \"status\": \"\"}, {\"addr\": \"192.168.122.105\", \"hostname\": \"np0005652757.localdomain\", \"labels\": [\"mon\", \"_admin\", \"mgr\"], \"status\": \"\"}, {\"addr\": \"192.168.122.106\", \"hostname\": \"np0005652759.localdomain\", \"labels\": [\"osd\"], \"status\": \"\"}, {\"addr\": \"192.168.122.107\", \"hostname\": \"np0005652760.localdomain\", \"labels\": [\"osd\"], \"status\": \"\"}, {\"addr\": \"192.168.122.108\", \"hostname\": \"np0005652761.localdomain\", \"labels\": [\"osd\"], \"status\": \"\"}]", "stdout_lines": ["", "[{\"addr\": \"192.168.122.103\", \"hostname\": \"np0005652755.localdomain\", \"labels\": [\"_admin\", \"mon\", \"mgr\"], \"status\": \"\"}, {\"addr\": \"192.168.122.104\", \"hostname\": \"np0005652756.localdomain\", \"labels\": [\"mon\", \"_admin\", \"mgr\"], \"status\": \"\"}, {\"addr\": \"192.168.122.105\", 
\"hostname\": \"np0005652757.localdomain\", \"labels\": [\"mon\", \"_admin\", \"mgr\"], \"status\": \"\"}, {\"addr\": \"192.168.122.106\", \"hostname\": \"np0005652759.localdomain\", \"labels\": [\"osd\"], \"status\": \"\"}, {\"addr\": \"192.168.122.107\", \"hostname\": \"np0005652760.localdomain\", \"labels\": [\"osd\"], \"status\": \"\"}, {\"addr\": \"192.168.122.108\", \"hostname\": \"np0005652761.localdomain\", \"labels\": [\"osd\"], \"status\": \"\"}]"]} TASK [ceph_migrate : Load nodes] *********************************************** ok: [np0005652755.localdomain] => {"ansible_facts": {"nds": [{"addr": "192.168.122.103", "hostname": "np0005652755.localdomain", "labels": ["_admin", "mon", "mgr"], "status": ""}, {"addr": "192.168.122.104", "hostname": "np0005652756.localdomain", "labels": ["mon", "_admin", "mgr"], "status": ""}, {"addr": "192.168.122.105", "hostname": "np0005652757.localdomain", "labels": ["mon", "_admin", "mgr"], "status": ""}, {"addr": "192.168.122.106", "hostname": "np0005652759.localdomain", "labels": ["osd"], "status": ""}, {"addr": "192.168.122.107", "hostname": "np0005652760.localdomain", "labels": ["osd"], "status": ""}, {"addr": "192.168.122.108", "hostname": "np0005652761.localdomain", "labels": ["osd"], "status": ""}]}, "changed": false} TASK [ceph_migrate : Load hostmap List] **************************************** ok: [np0005652755.localdomain] => {"ansible_facts": {"hostmap": {"np0005652755.localdomain": ["_admin", "mon", "mgr"], "np0005652756.localdomain": ["mon", "_admin", "mgr"], "np0005652757.localdomain": ["mon", "_admin", "mgr"], "np0005652759.localdomain": ["osd"], "np0005652760.localdomain": ["osd"], "np0005652761.localdomain": ["osd"]}}, "changed": false} TASK [ceph_migrate : Print Host Map] ******************************************* skipping: [np0005652755.localdomain] => (item=np0005652755.localdomain) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": "np0005652755.localdomain"} 
skipping: [np0005652755.localdomain] => (item=np0005652756.localdomain) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": "np0005652756.localdomain"} skipping: [np0005652755.localdomain] => (item=np0005652757.localdomain) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": "np0005652757.localdomain"} skipping: [np0005652755.localdomain] => (item=np0005652759.localdomain) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": "np0005652759.localdomain"} skipping: [np0005652755.localdomain] => (item=np0005652760.localdomain) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": "np0005652760.localdomain"} skipping: [np0005652755.localdomain] => (item=np0005652761.localdomain) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": "np0005652761.localdomain"} skipping: [np0005652755.localdomain] => {"msg": "All items skipped"} TASK [ceph_migrate : Dump ceph orch host ls output to log file] **************** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "dump | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Get Ceph monmap and load data] **************************** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["cephadm", "shell", "--", "ceph", "mon", "dump", "-f", "json"], "delta": "0:00:02.789390", "end": "2026-03-20 09:45:11.712739", "msg": "", "rc": 0, "start": "2026-03-20 09:45:08.923349", "stderr": "Inferring fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90\nInferring config /var/lib/ceph/39ff5591-b969-58ac-89fa-bf85e4fa1d90/mon.np0005652755/config\nUsing ceph image with id '957884a57883' and tag 'latest' created on 2026-02-09 10:26:08 +0000 UTC\nregistry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\ndumped monmap epoch 3", "stderr_lines": ["Inferring fsid 
39ff5591-b969-58ac-89fa-bf85e4fa1d90", "Inferring config /var/lib/ceph/39ff5591-b969-58ac-89fa-bf85e4fa1d90/mon.np0005652755/config", "Using ceph image with id '957884a57883' and tag 'latest' created on 2026-02-09 10:26:08 +0000 UTC", "registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3", "dumped monmap epoch 3"], "stdout": "\n{\"epoch\":3,\"fsid\":\"39ff5591-b969-58ac-89fa-bf85e4fa1d90\",\"modified\":\"2026-03-20T07:44:03.962644Z\",\"created\":\"2026-03-20T07:41:47.261900Z\",\"min_mon_release\":18,\"min_mon_release_name\":\"reef\",\"election_strategy\":1,\"disallowed_leaders: \":\"\",\"stretch_mode\":false,\"tiebreaker_mon\":\"\",\"removed_ranks: \":\"\",\"features\":{\"persistent\":[\"kraken\",\"luminous\",\"mimic\",\"osdmap-prune\",\"nautilus\",\"octopus\",\"pacific\",\"elector-pinging\",\"quincy\",\"reef\"],\"optional\":[]},\"mons\":[{\"rank\":0,\"name\":\"np0005652755\",\"public_addrs\":{\"addrvec\":[{\"type\":\"v2\",\"addr\":\"172.18.0.103:3300\",\"nonce\":0},{\"type\":\"v1\",\"addr\":\"172.18.0.103:6789\",\"nonce\":0}]},\"addr\":\"172.18.0.103:6789/0\",\"public_addr\":\"172.18.0.103:6789/0\",\"priority\":0,\"weight\":0,\"crush_location\":\"{}\"},{\"rank\":1,\"name\":\"np0005652757\",\"public_addrs\":{\"addrvec\":[{\"type\":\"v2\",\"addr\":\"172.18.0.105:3300\",\"nonce\":0},{\"type\":\"v1\",\"addr\":\"172.18.0.105:6789\",\"nonce\":0}]},\"addr\":\"172.18.0.105:6789/0\",\"public_addr\":\"172.18.0.105:6789/0\",\"priority\":0,\"weight\":0,\"crush_location\":\"{}\"},{\"rank\":2,\"name\":\"np0005652756\",\"public_addrs\":{\"addrvec\":[{\"type\":\"v2\",\"addr\":\"172.18.0.104:3300\",\"nonce\":0},{\"type\":\"v1\",\"addr\":\"172.18.0.104:6789\",\"nonce\":0}]},\"addr\":\"172.18.0.104:6789/0\",\"public_addr\":\"172.18.0.104:6789/0\",\"priority\":0,\"weight\":0,\"crush_location\":\"{}\"}],\"quorum\":[0,1,2]}", "stdout_lines": ["", 
"{\"epoch\":3,\"fsid\":\"39ff5591-b969-58ac-89fa-bf85e4fa1d90\",\"modified\":\"2026-03-20T07:44:03.962644Z\",\"created\":\"2026-03-20T07:41:47.261900Z\",\"min_mon_release\":18,\"min_mon_release_name\":\"reef\",\"election_strategy\":1,\"disallowed_leaders: \":\"\",\"stretch_mode\":false,\"tiebreaker_mon\":\"\",\"removed_ranks: \":\"\",\"features\":{\"persistent\":[\"kraken\",\"luminous\",\"mimic\",\"osdmap-prune\",\"nautilus\",\"octopus\",\"pacific\",\"elector-pinging\",\"quincy\",\"reef\"],\"optional\":[]},\"mons\":[{\"rank\":0,\"name\":\"np0005652755\",\"public_addrs\":{\"addrvec\":[{\"type\":\"v2\",\"addr\":\"172.18.0.103:3300\",\"nonce\":0},{\"type\":\"v1\",\"addr\":\"172.18.0.103:6789\",\"nonce\":0}]},\"addr\":\"172.18.0.103:6789/0\",\"public_addr\":\"172.18.0.103:6789/0\",\"priority\":0,\"weight\":0,\"crush_location\":\"{}\"},{\"rank\":1,\"name\":\"np0005652757\",\"public_addrs\":{\"addrvec\":[{\"type\":\"v2\",\"addr\":\"172.18.0.105:3300\",\"nonce\":0},{\"type\":\"v1\",\"addr\":\"172.18.0.105:6789\",\"nonce\":0}]},\"addr\":\"172.18.0.105:6789/0\",\"public_addr\":\"172.18.0.105:6789/0\",\"priority\":0,\"weight\":0,\"crush_location\":\"{}\"},{\"rank\":2,\"name\":\"np0005652756\",\"public_addrs\":{\"addrvec\":[{\"type\":\"v2\",\"addr\":\"172.18.0.104:3300\",\"nonce\":0},{\"type\":\"v1\",\"addr\":\"172.18.0.104:6789\",\"nonce\":0}]},\"addr\":\"172.18.0.104:6789/0\",\"public_addr\":\"172.18.0.104:6789/0\",\"priority\":0,\"weight\":0,\"crush_location\":\"{}\"}],\"quorum\":[0,1,2]}"]} TASK [ceph_migrate : Get Monmap] *********************************************** ok: [np0005652755.localdomain] => {"ansible_facts": {"mon_dump": {"created": "2026-03-20T07:41:47.261900Z", "disallowed_leaders: ": "", "election_strategy": 1, "epoch": 3, "features": {"optional": [], "persistent": ["kraken", "luminous", "mimic", "osdmap-prune", "nautilus", "octopus", "pacific", "elector-pinging", "quincy", "reef"]}, "fsid": "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "min_mon_release": 18, 
"min_mon_release_name": "reef", "modified": "2026-03-20T07:44:03.962644Z", "mons": [{"addr": "172.18.0.103:6789/0", "crush_location": "{}", "name": "np0005652755", "priority": 0, "public_addr": "172.18.0.103:6789/0", "public_addrs": {"addrvec": [{"addr": "172.18.0.103:3300", "nonce": 0, "type": "v2"}, {"addr": "172.18.0.103:6789", "nonce": 0, "type": "v1"}]}, "rank": 0, "weight": 0}, {"addr": "172.18.0.105:6789/0", "crush_location": "{}", "name": "np0005652757", "priority": 0, "public_addr": "172.18.0.105:6789/0", "public_addrs": {"addrvec": [{"addr": "172.18.0.105:3300", "nonce": 0, "type": "v2"}, {"addr": "172.18.0.105:6789", "nonce": 0, "type": "v1"}]}, "rank": 1, "weight": 0}, {"addr": "172.18.0.104:6789/0", "crush_location": "{}", "name": "np0005652756", "priority": 0, "public_addr": "172.18.0.104:6789/0", "public_addrs": {"addrvec": [{"addr": "172.18.0.104:3300", "nonce": 0, "type": "v2"}, {"addr": "172.18.0.104:6789", "nonce": 0, "type": "v1"}]}, "rank": 2, "weight": 0}], "quorum": [0, 1, 2], "removed_ranks: ": "", "stretch_mode": false, "tiebreaker_mon": ""}}, "changed": false} TASK [ceph_migrate : Print monmap] ********************************************* skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Dump ceph mon dump output to log file] ******************** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "dump | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Load nodes to decommission] ******************************* ok: [np0005652755.localdomain] => {"ansible_facts": {"decomm_nodes": ["np0005652755.localdomain", "np0005652756.localdomain", "np0005652757.localdomain"]}, "changed": false} TASK [ceph_migrate : Load target nodes] **************************************** ok: [np0005652755.localdomain] => {"ansible_facts": {"target_nodes": ["np0005652759.localdomain", "np0005652760.localdomain", "np0005652761.localdomain"]}, 
"changed": false} TASK [ceph_migrate : Print target nodes] *************************************** skipping: [np0005652755.localdomain] => {"false_condition": "debug|default(false)"} TASK [ceph_migrate : Print decomm_nodes] *************************************** skipping: [np0005652755.localdomain] => {"false_condition": "debug|default(false)"} TASK [ceph_migrate : ansible.builtin.fail if input is not provided] ************ skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ceph is undefined or ceph | length == 0", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Get cluster health] *************************************** skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : ansible.builtin.fail if health is HEALTH_WARN || HEALTH_ERR] *** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ceph.health.status == 'HEALTH_WARN' or ceph.health.status == 'HEALTH_ERR'", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : PgMap] **************************************************** skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : ansible.builtin.fail if PGs are not in active+clean state] *** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "pgstate != 'active+clean'", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : OSDMap] *************************************************** ok: [np0005652755.localdomain] => { "msg": "100.0" } TASK [ceph_migrate : ansible.builtin.fail if there is an unacceptable OSDs number] *** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "pct | float < 100", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MonMap] *************************************************** skipping: [np0005652755.localdomain] => {"false_condition": "check_ceph_release | 
default(false) | bool"} TASK [ceph_migrate : ansible.builtin.fail if Ceph <= Quincy] ******************* skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "check_ceph_release | default(false) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Mons in quorum] ******************************************* skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : ansible.builtin.fail if Mons are not in quorum] *********** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ceph.monmap.num_mons < decomm_nodes | length", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : is Ceph Mgr available] ************************************ skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : ansible.builtin.fail if Mgr is not available] ************* skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "not ceph.mgrmap.available | bool | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : in progress events] *************************************** skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : ansible.builtin.fail if there are in progress events] ***** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ceph.progress_events | length > 0", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Dump Ceph Status] ***************************************** skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Get ceph_cli] ********************************************* included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_cli.yaml for np0005652755.localdomain TASK [ceph_migrate : Set ceph CLI] 
********************************************* ok: [np0005652755.localdomain] => {"ansible_facts": {"ceph_cli": "podman run --rm --net=host --ipc=host --volume /etc/ceph:/etc/ceph:z --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90 -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring "}, "changed": false} TASK [ceph_migrate : set container image base in ceph configuration] *********** ok: [np0005652755.localdomain] => {"changed": false, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "config", "set", "mgr", "mgr/cephadm/container_image_base", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest"], "delta": "0:00:00.699818", "end": "2026-03-20 09:45:14.102106", "msg": "", "rc": 0, "start": "2026-03-20 09:45:13.402288", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : set alertmanager container image in ceph configuration] *** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : set grafana container image in ceph configuration] ******** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : set node-exporter container image in ceph configuration] *** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : set prometheus container image in ceph configuration] ***** skipping: 
[np0005652755.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Set haproxy container image in ceph configuration] ******** ok: [np0005652755.localdomain] => {"changed": false, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "config", "set", "mgr", "mgr/cephadm/container_image_haproxy", "registry.redhat.io/rhceph/rhceph-haproxy-rhel9:latest"], "delta": "0:00:00.588563", "end": "2026-03-20 09:45:15.394548", "msg": "", "rc": 0, "start": "2026-03-20 09:45:14.805985", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : Set keepalived container image in ceph configuration] ***** ok: [np0005652755.localdomain] => {"changed": false, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "config", "set", "mgr", "mgr/cephadm/container_image_keepalived", "registry.redhat.io/rhceph/keepalived-rhel9:latest"], "delta": "0:00:00.746359", "end": "2026-03-20 09:45:16.705131", "msg": "", "rc": 0, "start": "2026-03-20 09:45:15.958772", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : Update firewall rules on the target nodes] **************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/firewall.yaml for np0005652755.localdomain => (item=np0005652759.localdomain) included: 
/home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/firewall.yaml for np0005652755.localdomain => (item=np0005652760.localdomain) included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/firewall.yaml for np0005652755.localdomain => (item=np0005652761.localdomain) TASK [ceph_migrate : Ceph Migration - Apply the Ceph cluster rules (iptables)] *** skipping: [np0005652755.localdomain] => (item=/etc/sysconfig/iptables) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_firewall_type == \"iptables\"", "item": "/etc/sysconfig/iptables", "skip_reason": "Conditional result was False"} skipping: [np0005652755.localdomain] => (item=/etc/sysconfig/ip6tables) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_firewall_type == \"iptables\"", "item": "/etc/sysconfig/ip6tables", "skip_reason": "Conditional result was False"} skipping: [np0005652755.localdomain] => {"changed": false, "msg": "All items skipped"} TASK [ceph_migrate : Ensure firewall is enabled/started - iptables] ************ skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ceph_firewall_type == \"iptables\"", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Ceph Migration - Apply the Ceph cluster rules (nftables)] *** changed: [np0005652755.localdomain -> np0005652759.localdomain(192.168.122.106)] => {"changed": true, "msg": "Block inserted and ownership, perms or SE linux context changed"} TASK [ceph_migrate : Ensure firewall is enabled/started - nftables] ************ changed: [np0005652755.localdomain -> np0005652759.localdomain(192.168.122.106)] => {"changed": true, "enabled": true, "name": "nftables", "state": "started", "status": {"AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestamp": "Fri 2026-03-20 07:53:34 UTC", "ActiveEnterTimestampMonotonic": "4463127943", 
"ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "system.slice sysinit.target basic.target systemd-journald.socket", "AllowIsolate": "no", "AssertResult": "yes", "AssertTimestamp": "Fri 2026-03-20 07:53:34 UTC", "AssertTimestampMonotonic": "4463051183", "Before": "multi-user.target network-pre.target shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "21850000", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "yes", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Fri 2026-03-20 07:53:34 UTC", "ConditionTimestampMonotonic": "4463051181", "ConfigurationDirectoryMode": "0755", "Conflicts": "shutdown.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Netfilter Tables", "DevicePolicy": "auto", "Documentation": "\"man:nft(8)\"", "DynamicUser": "no", "ExecMainCode": "1", 
"ExecMainExitTimestamp": "Fri 2026-03-20 07:53:34 UTC", "ExecMainExitTimestampMonotonic": "4463127785", "ExecMainPID": "42217", "ExecMainStartTimestamp": "Fri 2026-03-20 07:53:34 UTC", "ExecMainStartTimestampMonotonic": "4463052831", "ExecMainStatus": "0", "ExecReload": "{ path=/sbin/nft ; argv[]=/sbin/nft flush ruleset; include \"/etc/sysconfig/nftables.conf\"; ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/sbin/nft ; argv[]=/sbin/nft flush ruleset; include \"/etc/sysconfig/nftables.conf\"; ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStart": "{ path=/sbin/nft ; argv[]=/sbin/nft -f /etc/sysconfig/nftables.conf ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/sbin/nft ; argv[]=/sbin/nft -f /etc/sysconfig/nftables.conf ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/sbin/nft ; argv[]=/sbin/nft flush ruleset ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/sbin/nft ; argv[]=/sbin/nft flush ruleset ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/usr/lib/systemd/system/nftables.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "yes", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "nftables.service", 
"IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Fri 2026-03-20 07:53:34 UTC", "InactiveExitTimestampMonotonic": "4463053045", "InvocationID": "21e395374d9f456cbcd3cb59762bffbf", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "62638", "LimitNPROCSoft": "62638", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "62638", "LimitSIGPENDINGSoft": "62638", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "nftables.service", "NeedDaemonReload": "no", "Nice": "0", 
"NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "yes", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "full", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "Requires": "system.slice sysinit.target", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-03-20 07:53:34 UTC", "StateChangeTimestampMonotonic": "4463127943", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "exited", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "100220", 
"TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "Wants": "network-pre.target", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0"}} TASK [ceph_migrate : Ceph Migration - Apply the Ceph cluster rules (iptables)] *** skipping: [np0005652755.localdomain] => (item=/etc/sysconfig/iptables) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_firewall_type == \"iptables\"", "item": "/etc/sysconfig/iptables", "skip_reason": "Conditional result was False"} skipping: [np0005652755.localdomain] => (item=/etc/sysconfig/ip6tables) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_firewall_type == \"iptables\"", "item": "/etc/sysconfig/ip6tables", "skip_reason": "Conditional result was False"} skipping: [np0005652755.localdomain] => {"changed": false, "msg": "All items skipped"} TASK [ceph_migrate : Ensure firewall is enabled/started - iptables] ************ skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ceph_firewall_type == \"iptables\"", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Ceph Migration - Apply the Ceph cluster rules (nftables)] *** changed: [np0005652755.localdomain -> np0005652760.localdomain(192.168.122.107)] => {"changed": true, "msg": "Block inserted and ownership, perms or SE linux context changed"} TASK [ceph_migrate : Ensure firewall is enabled/started - nftables] ************ changed: [np0005652755.localdomain -> np0005652760.localdomain(192.168.122.107)] => {"changed": true, "enabled": true, "name": "nftables", "state": "started", "status": {"AccessSELinuxContext": 
"system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestamp": "Fri 2026-03-20 07:53:36 UTC", "ActiveEnterTimestampMonotonic": "4466673517", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "system.slice systemd-journald.socket sysinit.target basic.target", "AllowIsolate": "no", "AssertResult": "yes", "AssertTimestamp": "Fri 2026-03-20 07:53:36 UTC", "AssertTimestampMonotonic": "4466562788", "Before": "network-pre.target multi-user.target shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "32097000", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "yes", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Fri 2026-03-20 07:53:36 UTC", "ConditionTimestampMonotonic": "4466562785", "ConfigurationDirectoryMode": "0755", "Conflicts": "shutdown.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", 
"Description": "Netfilter Tables", "DevicePolicy": "auto", "Documentation": "\"man:nft(8)\"", "DynamicUser": "no", "ExecMainCode": "1", "ExecMainExitTimestamp": "Fri 2026-03-20 07:53:36 UTC", "ExecMainExitTimestampMonotonic": "4466673174", "ExecMainPID": "42310", "ExecMainStartTimestamp": "Fri 2026-03-20 07:53:36 UTC", "ExecMainStartTimestampMonotonic": "4466581299", "ExecMainStatus": "0", "ExecReload": "{ path=/sbin/nft ; argv[]=/sbin/nft flush ruleset; include \"/etc/sysconfig/nftables.conf\"; ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/sbin/nft ; argv[]=/sbin/nft flush ruleset; include \"/etc/sysconfig/nftables.conf\"; ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStart": "{ path=/sbin/nft ; argv[]=/sbin/nft -f /etc/sysconfig/nftables.conf ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/sbin/nft ; argv[]=/sbin/nft -f /etc/sysconfig/nftables.conf ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/sbin/nft ; argv[]=/sbin/nft flush ruleset ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/sbin/nft ; argv[]=/sbin/nft flush ruleset ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/usr/lib/systemd/system/nftables.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "yes", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", 
"IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "nftables.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Fri 2026-03-20 07:53:36 UTC", "InactiveExitTimestampMonotonic": "4466581517", "InvocationID": "096a32d4a5854ce6a0526d1ce5797a77", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "62638", "LimitNPROCSoft": "62638", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "62638", "LimitSIGPENDINGSoft": "62638", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", 
"NRestarts": "0", "NUMAPolicy": "n/a", "Names": "nftables.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "yes", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "full", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "Requires": "system.slice sysinit.target", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-03-20 07:53:36 UTC", "StateChangeTimestampMonotonic": "4466673517", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "exited", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": 
"no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "100220", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "Wants": "network-pre.target", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0"}} TASK [ceph_migrate : Ceph Migration - Apply the Ceph cluster rules (iptables)] *** skipping: [np0005652755.localdomain] => (item=/etc/sysconfig/iptables) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_firewall_type == \"iptables\"", "item": "/etc/sysconfig/iptables", "skip_reason": "Conditional result was False"} skipping: [np0005652755.localdomain] => (item=/etc/sysconfig/ip6tables) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_firewall_type == \"iptables\"", "item": "/etc/sysconfig/ip6tables", "skip_reason": "Conditional result was False"} skipping: [np0005652755.localdomain] => {"changed": false, "msg": "All items skipped"} TASK [ceph_migrate : Ensure firewall is enabled/started - iptables] ************ skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ceph_firewall_type == \"iptables\"", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Ceph Migration - Apply the Ceph cluster rules (nftables)] *** changed: [np0005652755.localdomain -> np0005652761.localdomain(192.168.122.108)] => {"changed": true, "msg": "Block inserted and ownership, perms or SE linux context changed"} TASK [ceph_migrate : Ensure firewall is enabled/started - nftables] ************ changed: [np0005652755.localdomain -> np0005652761.localdomain(192.168.122.108)] => 
{"changed": true, "enabled": true, "name": "nftables", "state": "started", "status": {"AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestamp": "Fri 2026-03-20 07:53:35 UTC", "ActiveEnterTimestampMonotonic": "4466572892", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "sysinit.target system.slice systemd-journald.socket basic.target", "AllowIsolate": "no", "AssertResult": "yes", "AssertTimestamp": "Fri 2026-03-20 07:53:35 UTC", "AssertTimestampMonotonic": "4466488816", "Before": "network-pre.target shutdown.target multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "29674000", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "yes", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Fri 2026-03-20 07:53:35 UTC", "ConditionTimestampMonotonic": "4466488815", "ConfigurationDirectoryMode": "0755", "Conflicts": "shutdown.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": 
"0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Netfilter Tables", "DevicePolicy": "auto", "Documentation": "\"man:nft(8)\"", "DynamicUser": "no", "ExecMainCode": "1", "ExecMainExitTimestamp": "Fri 2026-03-20 07:53:35 UTC", "ExecMainExitTimestampMonotonic": "4466572595", "ExecMainPID": "41791", "ExecMainStartTimestamp": "Fri 2026-03-20 07:53:35 UTC", "ExecMainStartTimestampMonotonic": "4466490161", "ExecMainStatus": "0", "ExecReload": "{ path=/sbin/nft ; argv[]=/sbin/nft flush ruleset; include \"/etc/sysconfig/nftables.conf\"; ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/sbin/nft ; argv[]=/sbin/nft flush ruleset; include \"/etc/sysconfig/nftables.conf\"; ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStart": "{ path=/sbin/nft ; argv[]=/sbin/nft -f /etc/sysconfig/nftables.conf ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/sbin/nft ; argv[]=/sbin/nft -f /etc/sysconfig/nftables.conf ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/sbin/nft ; argv[]=/sbin/nft flush ruleset ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/sbin/nft ; argv[]=/sbin/nft flush ruleset ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/usr/lib/systemd/system/nftables.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "yes", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": 
"18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "nftables.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Fri 2026-03-20 07:53:35 UTC", "InactiveExitTimestampMonotonic": "4466490348", "InvocationID": "5bdaf121f67c4dc494871cae1368ab13", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "62638", "LimitNPROCSoft": "62638", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "62638", "LimitSIGPENDINGSoft": "62638", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", 
"MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "nftables.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "yes", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "full", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "Requires": "system.slice sysinit.target", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-03-20 07:53:35 UTC", "StateChangeTimestampMonotonic": "4466572892", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "exited", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", 
"SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "100220", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "Wants": "network-pre.target", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0"}} TASK [ceph_migrate : Get ceph_cli] ********************************************* skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Set the dashboard port] *********************************** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Set the dashboard ssl port] ******************************* skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Disable mgr dashboard module (restart)] ******************* skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Enable mgr dashboard module (restart)] ******************** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": 
"ceph_daemons_layout.monitoring | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Set the prometheus server port] *************************** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Set the prometheus server address] ************************ skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Enable prometheus module] ********************************* skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Set/Unset labels - add] *********************************** skipping: [np0005652755.localdomain] => {"false_condition": "ceph_daemons_layout.monitoring | default(true) | bool"} TASK [ceph_migrate : Print nodes] ********************************************** skipping: [np0005652755.localdomain] => {"false_condition": "ceph_daemons_layout.monitoring | default(true) | bool"} TASK [ceph_migrate : Set/Unset labels - add] *********************************** skipping: [np0005652755.localdomain] => (item=['np0005652759.localdomain', 'monitoring']) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "item": ["np0005652759.localdomain", "monitoring"], "skip_reason": "Conditional result was False"} skipping: [np0005652755.localdomain] => (item=['np0005652760.localdomain', 'monitoring']) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "item": ["np0005652760.localdomain", "monitoring"], "skip_reason": "Conditional result was False"} 
skipping: [np0005652755.localdomain] => (item=['np0005652761.localdomain', 'monitoring']) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "item": ["np0005652761.localdomain", "monitoring"], "skip_reason": "Conditional result was False"} skipping: [np0005652755.localdomain] => {"changed": false, "msg": "All items skipped"} TASK [ceph_migrate : MONITORING - Load Spec from the orchestrator] ************* skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Print the loaded data] ************************************ skipping: [np0005652755.localdomain] => {"false_condition": "ceph_daemons_layout.monitoring | default(true) | bool"} TASK [ceph_migrate : Update the Monitoring Stack spec definition] ************** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Print the resulting spec] ********************************* skipping: [np0005652755.localdomain] => {"false_condition": "ceph_daemons_layout.monitoring | default(true) | bool"} TASK [ceph_migrate : MONITORING - wait daemons] ******************************** skipping: [np0005652755.localdomain] => (item=grafana) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "item": "grafana", "skip_reason": "Conditional result was False"} skipping: [np0005652755.localdomain] => (item=prometheus) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "item": "prometheus", "skip_reason": "Conditional result was False"} skipping: [np0005652755.localdomain] => (item=alertmanager) => {"ansible_loop_var": "item", 
"changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "item": "alertmanager", "skip_reason": "Conditional result was False"} skipping: [np0005652755.localdomain] => {"changed": false, "msg": "All items skipped"} TASK [ceph_migrate : Sleep before moving to the next daemon] ******************* skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Get ceph_cli] ********************************************* included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_cli.yaml for np0005652755.localdomain TASK [ceph_migrate : Set ceph CLI] ********************************************* ok: [np0005652755.localdomain] => {"ansible_facts": {"ceph_cli": "podman run --rm --net=host --ipc=host --volume /etc/ceph:/etc/ceph:z --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90 -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring "}, "changed": false} TASK [ceph_migrate : MDS - Load Spec from the orchestrator] ******************** ok: [np0005652755.localdomain] => {"ansible_facts": {"mds_spec": {"service_name": "mds.mds", "service_type": "mds", "spec": {}}}, "changed": false} TASK [ceph_migrate : Print the resulting MDS spec] ***************************** skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Set/Unset labels - add] *********************************** skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Print nodes] ********************************************** skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Set/Unset labels - add] *********************************** changed: 
[np0005652755.localdomain] => (item=['np0005652755.localdomain', 'mds']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005652755.localdomain", "mds"], "delta": "0:00:00.654700", "end": "2026-03-20 09:45:27.339861", "item": ["np0005652755.localdomain", "mds"], "msg": "", "rc": 0, "start": "2026-03-20 09:45:26.685161", "stderr": "", "stderr_lines": [], "stdout": "Added label mds to host np0005652755.localdomain", "stdout_lines": ["Added label mds to host np0005652755.localdomain"]} changed: [np0005652755.localdomain] => (item=['np0005652756.localdomain', 'mds']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005652756.localdomain", "mds"], "delta": "0:00:00.575254", "end": "2026-03-20 09:45:28.410530", "item": ["np0005652756.localdomain", "mds"], "msg": "", "rc": 0, "start": "2026-03-20 09:45:27.835276", "stderr": "", "stderr_lines": [], "stdout": "Added label mds to host np0005652756.localdomain", "stdout_lines": ["Added label mds to host np0005652756.localdomain"]} changed: [np0005652755.localdomain] => (item=['np0005652757.localdomain', 'mds']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", 
"39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005652757.localdomain", "mds"], "delta": "0:00:00.620124", "end": "2026-03-20 09:45:29.583107", "item": ["np0005652757.localdomain", "mds"], "msg": "", "rc": 0, "start": "2026-03-20 09:45:28.962983", "stderr": "", "stderr_lines": [], "stdout": "Added label mds to host np0005652757.localdomain", "stdout_lines": ["Added label mds to host np0005652757.localdomain"]} changed: [np0005652755.localdomain] => (item=['np0005652759.localdomain', 'mds']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005652759.localdomain", "mds"], "delta": "0:00:00.694811", "end": "2026-03-20 09:45:30.866529", "item": ["np0005652759.localdomain", "mds"], "msg": "", "rc": 0, "start": "2026-03-20 09:45:30.171718", "stderr": "", "stderr_lines": [], "stdout": "Added label mds to host np0005652759.localdomain", "stdout_lines": ["Added label mds to host np0005652759.localdomain"]} changed: [np0005652755.localdomain] => (item=['np0005652760.localdomain', 'mds']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005652760.localdomain", "mds"], "delta": "0:00:00.705356", "end": "2026-03-20 09:45:32.112253", "item": ["np0005652760.localdomain", "mds"], "msg": "", "rc": 0, "start": "2026-03-20 
09:45:31.406897", "stderr": "", "stderr_lines": [], "stdout": "Added label mds to host np0005652760.localdomain", "stdout_lines": ["Added label mds to host np0005652760.localdomain"]} changed: [np0005652755.localdomain] => (item=['np0005652761.localdomain', 'mds']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005652761.localdomain", "mds"], "delta": "0:00:00.661681", "end": "2026-03-20 09:45:33.329725", "item": ["np0005652761.localdomain", "mds"], "msg": "", "rc": 0, "start": "2026-03-20 09:45:32.668044", "stderr": "", "stderr_lines": [], "stdout": "Added label mds to host np0005652761.localdomain", "stdout_lines": ["Added label mds to host np0005652761.localdomain"]} TASK [ceph_migrate : Update the MDS Daemon spec definition] ******************** ok: [np0005652755.localdomain] => {"changed": false, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "-v", "/etc/ceph:/etc/ceph:z", "-v", "/var/lib/ceph/:/var/lib/ceph/:z", "-v", "/var/log/ceph/:/var/log/ceph/:z", "-v", "/home/tripleo-admin/mds:/home/tripleo-admin/mds:z", "--entrypoint=ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "-n", "client.admin", "-k", "/etc/ceph/ceph.client.admin.keyring", "--cluster", "ceph", "orch", "apply", "--in-file", "/home/tripleo-admin/mds"], "delta": "0:00:00.721345", "end": "2026-03-20 09:45:34.790487", "rc": 0, "start": "2026-03-20 09:45:34.069142", "stderr": "", "stderr_lines": [], "stdout": "Scheduled mds.mds update...", "stdout_lines": ["Scheduled mds.mds update..."]} TASK [ceph_migrate : Print the resulting spec] ********************************* skipping: [np0005652755.localdomain] => {"false_condition": "debug | 
default(false)"} TASK [ceph_migrate : Wait for the orchestrator to process the spec] ************ Pausing for 30 seconds ok: [np0005652755.localdomain] => {"changed": false, "delta": 30, "echo": true, "rc": 0, "start": "2026-03-20 09:45:34.968132", "stderr": "", "stdout": "Paused for 30.0 seconds", "stop": "2026-03-20 09:46:04.972193", "user_input": ""} TASK [ceph_migrate : Reload the updated mdsmap] ******************************** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "fs", "status", "cephfs", "-f", "json"], "delta": "0:00:00.708002", "end": "2026-03-20 09:46:06.217502", "msg": "", "rc": 0, "start": "2026-03-20 09:46:05.509500", "stderr": "", "stderr_lines": [], "stdout": "\n{\"clients\": [{\"clients\": 0, \"fs\": \"cephfs\"}], \"mds_version\": [{\"daemon\": [\"mds.np0005652757.yafnuy\", \"mds.np0005652755.umngoz\", \"mds.np0005652761.iysiog\", \"mds.np0005652759.jebckh\", \"mds.np0005652756.ercvjn\", \"mds.np0005652760.wqlirt\"], \"version\": \"ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable)\"}], \"mdsmap\": [{\"caps\": 0, \"dirs\": 12, \"dns\": 10, \"inos\": 13, \"name\": \"mds.np0005652757.yafnuy\", \"rank\": 0, \"rate\": 0, \"state\": \"active\"}, {\"name\": \"mds.np0005652755.umngoz\", \"state\": \"standby\"}, {\"name\": \"mds.np0005652761.iysiog\", \"state\": \"standby\"}, {\"name\": \"mds.np0005652759.jebckh\", \"state\": \"standby\"}, {\"name\": \"mds.np0005652756.ercvjn\", \"state\": \"standby\"}, {\"name\": \"mds.np0005652760.wqlirt\", \"state\": \"standby\"}], \"pools\": [{\"avail\": 14046000128, \"id\": 7, \"name\": \"manila_metadata\", \"type\": \"metadata\", \"used\": 98304}, {\"avail\": 14046000128, 
\"id\": 6, \"name\": \"manila_data\", \"type\": \"data\", \"used\": 0}]}", "stdout_lines": ["", "{\"clients\": [{\"clients\": 0, \"fs\": \"cephfs\"}], \"mds_version\": [{\"daemon\": [\"mds.np0005652757.yafnuy\", \"mds.np0005652755.umngoz\", \"mds.np0005652761.iysiog\", \"mds.np0005652759.jebckh\", \"mds.np0005652756.ercvjn\", \"mds.np0005652760.wqlirt\"], \"version\": \"ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable)\"}], \"mdsmap\": [{\"caps\": 0, \"dirs\": 12, \"dns\": 10, \"inos\": 13, \"name\": \"mds.np0005652757.yafnuy\", \"rank\": 0, \"rate\": 0, \"state\": \"active\"}, {\"name\": \"mds.np0005652755.umngoz\", \"state\": \"standby\"}, {\"name\": \"mds.np0005652761.iysiog\", \"state\": \"standby\"}, {\"name\": \"mds.np0005652759.jebckh\", \"state\": \"standby\"}, {\"name\": \"mds.np0005652756.ercvjn\", \"state\": \"standby\"}, {\"name\": \"mds.np0005652760.wqlirt\", \"state\": \"standby\"}], \"pools\": [{\"avail\": 14046000128, \"id\": 7, \"name\": \"manila_metadata\", \"type\": \"metadata\", \"used\": 98304}, {\"avail\": 14046000128, \"id\": 6, \"name\": \"manila_data\", \"type\": \"data\", \"used\": 0}]}"]} TASK [ceph_migrate : Get MDS Daemons] ****************************************** ok: [np0005652755.localdomain] => {"ansible_facts": {"mds_daemons": {"clients": [{"clients": 0, "fs": "cephfs"}], "mds_version": [{"daemon": ["mds.np0005652757.yafnuy", "mds.np0005652755.umngoz", "mds.np0005652761.iysiog", "mds.np0005652759.jebckh", "mds.np0005652756.ercvjn", "mds.np0005652760.wqlirt"], "version": "ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable)"}], "mdsmap": [{"caps": 0, "dirs": 12, "dns": 10, "inos": 13, "name": "mds.np0005652757.yafnuy", "rank": 0, "rate": 0, "state": "active"}, {"name": "mds.np0005652755.umngoz", "state": "standby"}, {"name": "mds.np0005652761.iysiog", "state": "standby"}, {"name": "mds.np0005652759.jebckh", "state": "standby"}, {"name": 
"mds.np0005652756.ercvjn", "state": "standby"}, {"name": "mds.np0005652760.wqlirt", "state": "standby"}], "pools": [{"avail": 14046000128, "id": 7, "name": "manila_metadata", "type": "metadata", "used": 98304}, {"avail": 14046000128, "id": 6, "name": "manila_data", "type": "data", "used": 0}]}}, "changed": false} TASK [ceph_migrate : Print Daemons] ******************************************** skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Get MDS daemons that are not part of decomm nodes] ******** skipping: [np0005652755.localdomain] => (item={'caps': 0, 'dirs': 12, 'dns': 10, 'inos': 13, 'name': 'mds.np0005652757.yafnuy', 'rank': 0, 'rate': 0, 'state': 'active'}) => {"ansible_loop_var": "item", "changed": false, "false_condition": "item.state == \"standby\"", "item": {"caps": 0, "dirs": 12, "dns": 10, "inos": 13, "name": "mds.np0005652757.yafnuy", "rank": 0, "rate": 0, "state": "active"}, "skip_reason": "Conditional result was False"} ok: [np0005652755.localdomain] => (item={'name': 'mds.np0005652755.umngoz', 'state': 'standby'}) => {"ansible_facts": {"mds_aff_daemon": {"name": "mds.np0005652755.umngoz", "state": "standby"}}, "ansible_loop_var": "item", "changed": false, "item": {"name": "mds.np0005652755.umngoz", "state": "standby"}} ok: [np0005652755.localdomain] => (item={'name': 'mds.np0005652761.iysiog', 'state': 'standby'}) => {"ansible_facts": {"mds_aff_daemon": {"name": "mds.np0005652761.iysiog", "state": "standby"}}, "ansible_loop_var": "item", "changed": false, "item": {"name": "mds.np0005652761.iysiog", "state": "standby"}} ok: [np0005652755.localdomain] => (item={'name': 'mds.np0005652759.jebckh', 'state': 'standby'}) => {"ansible_facts": {"mds_aff_daemon": {"name": "mds.np0005652759.jebckh", "state": "standby"}}, "ansible_loop_var": "item", "changed": false, "item": {"name": "mds.np0005652759.jebckh", "state": "standby"}} ok: [np0005652755.localdomain] => (item={'name': 
'mds.np0005652756.ercvjn', 'state': 'standby'}) => {"ansible_facts": {"mds_aff_daemon": {"name": "mds.np0005652756.ercvjn", "state": "standby"}}, "ansible_loop_var": "item", "changed": false, "item": {"name": "mds.np0005652756.ercvjn", "state": "standby"}} ok: [np0005652755.localdomain] => (item={'name': 'mds.np0005652760.wqlirt', 'state': 'standby'}) => {"ansible_facts": {"mds_aff_daemon": {"name": "mds.np0005652760.wqlirt", "state": "standby"}}, "ansible_loop_var": "item", "changed": false, "item": {"name": "mds.np0005652760.wqlirt", "state": "standby"}} TASK [ceph_migrate : Affinity daemon selected] ********************************* ok: [np0005652755.localdomain] => { "msg": { "name": "mds.np0005652760.wqlirt", "state": "standby" } } TASK [ceph_migrate : Set MDS affinity] ***************************************** changed: [np0005652755.localdomain] => {"changed": true, "cmd": "podman run --rm --net=host --ipc=host --volume /etc/ceph:/etc/ceph:z --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90 -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring config set mds.np0005652760.wqlirt mds_join_fs cephfs", "delta": "0:00:00.759845", "end": "2026-03-20 09:46:07.748168", "msg": "", "rc": 0, "start": "2026-03-20 09:46:06.988323", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : Set/Unset labels - rm] ************************************ skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Print nodes] ********************************************** skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Set/Unset labels - rm] ************************************ changed: [np0005652755.localdomain] => (item=['np0005652755.localdomain', 'mds']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", 
"/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "rm", "np0005652755.localdomain", "mds"], "delta": "0:00:00.729825", "end": "2026-03-20 09:46:09.189287", "item": ["np0005652755.localdomain", "mds"], "msg": "", "rc": 0, "start": "2026-03-20 09:46:08.459462", "stderr": "", "stderr_lines": [], "stdout": "Removed label mds from host np0005652755.localdomain", "stdout_lines": ["Removed label mds from host np0005652755.localdomain"]} changed: [np0005652755.localdomain] => (item=['np0005652756.localdomain', 'mds']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "rm", "np0005652756.localdomain", "mds"], "delta": "0:00:00.731766", "end": "2026-03-20 09:46:10.475249", "item": ["np0005652756.localdomain", "mds"], "msg": "", "rc": 0, "start": "2026-03-20 09:46:09.743483", "stderr": "", "stderr_lines": [], "stdout": "Removed label mds from host np0005652756.localdomain", "stdout_lines": ["Removed label mds from host np0005652756.localdomain"]} changed: [np0005652755.localdomain] => (item=['np0005652757.localdomain', 'mds']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "rm", "np0005652757.localdomain", "mds"], "delta": "0:00:00.724847", "end": 
"2026-03-20 09:46:11.743101", "item": ["np0005652757.localdomain", "mds"], "msg": "", "rc": 0, "start": "2026-03-20 09:46:11.018254", "stderr": "", "stderr_lines": [], "stdout": "Removed label mds from host np0005652757.localdomain", "stdout_lines": ["Removed label mds from host np0005652757.localdomain"]} TASK [ceph_migrate : Wait daemons] ********************************************* included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/wait_daemons.yaml for np0005652755.localdomain TASK [ceph_migrate : print daemon id option] *********************************** skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : wait for mds] ********************************************* changed: [np0005652755.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "ps", "--daemon_type", "mds", "-f", "json"], "delta": "0:00:00.741748", "end": "2026-03-20 09:46:13.209218", "msg": "", "rc": 0, "start": "2026-03-20 09:46:12.467470", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"container_id\": \"ffddbd508538\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"0.05%\", \"created\": \"2026-03-20T08:03:55.785439Z\", \"daemon_id\": \"mds.np0005652755.umngoz\", 
\"daemon_name\": \"mds.mds.np0005652755.umngoz\", \"daemon_type\": \"mds\", \"events\": [\"2026-03-20T08:03:55.893613Z daemon:mds.mds.np0005652755.umngoz [INFO] \\\"Deployed mds.mds.np0005652755.umngoz on host 'np0005652755.localdomain'\\\"\"], \"hostname\": \"np0005652755.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:37:45.426697Z\", \"memory_usage\": 25983713, \"ports\": [], \"service_name\": \"mds.mds\", \"started\": \"2026-03-20T08:03:55.667280Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}, {\"container_id\": \"3e8422af1dcd\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"0.05%\", \"created\": \"2026-03-20T08:03:57.849725Z\", \"daemon_id\": \"mds.np0005652756.ercvjn\", \"daemon_name\": \"mds.mds.np0005652756.ercvjn\", \"daemon_type\": \"mds\", \"events\": [\"2026-03-20T08:03:57.928589Z daemon:mds.mds.np0005652756.ercvjn [INFO] \\\"Deployed mds.mds.np0005652756.ercvjn on host 'np0005652756.localdomain'\\\"\"], \"hostname\": \"np0005652756.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:37:45.633114Z\", \"memory_usage\": 28521267, \"ports\": [], \"service_name\": \"mds.mds\", \"started\": \"2026-03-20T08:03:57.757768Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}, {\"container_id\": \"0640d029aa7c\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\", 
\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"0.17%\", \"created\": \"2026-03-20T08:03:53.112217Z\", \"daemon_id\": \"mds.np0005652757.yafnuy\", \"daemon_name\": \"mds.mds.np0005652757.yafnuy\", \"daemon_type\": \"mds\", \"events\": [\"2026-03-20T08:03:53.209421Z daemon:mds.mds.np0005652757.yafnuy [INFO] \\\"Deployed mds.mds.np0005652757.yafnuy on host 'np0005652757.localdomain'\\\"\"], \"hostname\": \"np0005652757.localdomain\", \"is_active\": true, \"last_refresh\": \"2026-03-20T09:37:44.749793Z\", \"memory_usage\": 27839692, \"ports\": [], \"service_name\": \"mds.mds\", \"started\": \"2026-03-20T08:03:53.012375Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}, {\"container_id\": \"0b158166e8d5\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"6.84%\", \"created\": \"2026-03-20T09:45:41.306016Z\", \"daemon_id\": \"mds.np0005652759.jebckh\", \"daemon_name\": \"mds.mds.np0005652759.jebckh\", \"daemon_type\": \"mds\", \"events\": [\"2026-03-20T09:45:41.386246Z daemon:mds.mds.np0005652759.jebckh [INFO] \\\"Deployed mds.mds.np0005652759.jebckh on host 'np0005652759.localdomain'\\\"\"], \"hostname\": \"np0005652759.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:45:43.234134Z\", \"memory_usage\": 14061404, \"ports\": [], \"service_name\": 
\"mds.mds\", \"started\": \"2026-03-20T09:45:41.198459Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}, {\"container_id\": \"5ce856e52ead\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"2.28%\", \"created\": \"2026-03-20T09:45:39.044459Z\", \"daemon_id\": \"mds.np0005652760.wqlirt\", \"daemon_name\": \"mds.mds.np0005652760.wqlirt\", \"daemon_type\": \"mds\", \"events\": [\"2026-03-20T09:45:39.110118Z daemon:mds.mds.np0005652760.wqlirt [INFO] \\\"Deployed mds.mds.np0005652760.wqlirt on host 'np0005652760.localdomain'\\\"\"], \"hostname\": \"np0005652760.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:45:43.309261Z\", \"memory_usage\": 16735272, \"ports\": [], \"service_name\": \"mds.mds\", \"started\": \"2026-03-20T09:45:38.938949Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}, {\"container_id\": \"a712951ec045\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"1.44%\", \"created\": \"2026-03-20T09:45:36.736480Z\", \"daemon_id\": \"mds.np0005652761.iysiog\", \"daemon_name\": \"mds.mds.np0005652761.iysiog\", \"daemon_type\": \"mds\", \"events\": 
[\"2026-03-20T09:45:36.818400Z daemon:mds.mds.np0005652761.iysiog [INFO] \\\"Deployed mds.mds.np0005652761.iysiog on host 'np0005652761.localdomain'\\\"\"], \"hostname\": \"np0005652761.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:45:43.263295Z\", \"memory_usage\": 16053698, \"ports\": [], \"service_name\": \"mds.mds\", \"started\": \"2026-03-20T09:45:36.644656Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}]", "stdout_lines": ["", "[{\"container_id\": \"ffddbd508538\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"0.05%\", \"created\": \"2026-03-20T08:03:55.785439Z\", \"daemon_id\": \"mds.np0005652755.umngoz\", \"daemon_name\": \"mds.mds.np0005652755.umngoz\", \"daemon_type\": \"mds\", \"events\": [\"2026-03-20T08:03:55.893613Z daemon:mds.mds.np0005652755.umngoz [INFO] \\\"Deployed mds.mds.np0005652755.umngoz on host 'np0005652755.localdomain'\\\"\"], \"hostname\": \"np0005652755.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:37:45.426697Z\", \"memory_usage\": 25983713, \"ports\": [], \"service_name\": \"mds.mds\", \"started\": \"2026-03-20T08:03:55.667280Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}, {\"container_id\": \"3e8422af1dcd\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\"], \"container_image_id\": 
\"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"0.05%\", \"created\": \"2026-03-20T08:03:57.849725Z\", \"daemon_id\": \"mds.np0005652756.ercvjn\", \"daemon_name\": \"mds.mds.np0005652756.ercvjn\", \"daemon_type\": \"mds\", \"events\": [\"2026-03-20T08:03:57.928589Z daemon:mds.mds.np0005652756.ercvjn [INFO] \\\"Deployed mds.mds.np0005652756.ercvjn on host 'np0005652756.localdomain'\\\"\"], \"hostname\": \"np0005652756.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:37:45.633114Z\", \"memory_usage\": 28521267, \"ports\": [], \"service_name\": \"mds.mds\", \"started\": \"2026-03-20T08:03:57.757768Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}, {\"container_id\": \"0640d029aa7c\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"0.17%\", \"created\": \"2026-03-20T08:03:53.112217Z\", \"daemon_id\": \"mds.np0005652757.yafnuy\", \"daemon_name\": \"mds.mds.np0005652757.yafnuy\", \"daemon_type\": \"mds\", \"events\": [\"2026-03-20T08:03:53.209421Z daemon:mds.mds.np0005652757.yafnuy [INFO] \\\"Deployed mds.mds.np0005652757.yafnuy on host 'np0005652757.localdomain'\\\"\"], \"hostname\": \"np0005652757.localdomain\", \"is_active\": true, \"last_refresh\": \"2026-03-20T09:37:44.749793Z\", \"memory_usage\": 27839692, \"ports\": [], \"service_name\": \"mds.mds\", \"started\": \"2026-03-20T08:03:53.012375Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}, 
{\"container_id\": \"0b158166e8d5\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"6.84%\", \"created\": \"2026-03-20T09:45:41.306016Z\", \"daemon_id\": \"mds.np0005652759.jebckh\", \"daemon_name\": \"mds.mds.np0005652759.jebckh\", \"daemon_type\": \"mds\", \"events\": [\"2026-03-20T09:45:41.386246Z daemon:mds.mds.np0005652759.jebckh [INFO] \\\"Deployed mds.mds.np0005652759.jebckh on host 'np0005652759.localdomain'\\\"\"], \"hostname\": \"np0005652759.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:45:43.234134Z\", \"memory_usage\": 14061404, \"ports\": [], \"service_name\": \"mds.mds\", \"started\": \"2026-03-20T09:45:41.198459Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}, {\"container_id\": \"5ce856e52ead\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"2.28%\", \"created\": \"2026-03-20T09:45:39.044459Z\", \"daemon_id\": \"mds.np0005652760.wqlirt\", \"daemon_name\": \"mds.mds.np0005652760.wqlirt\", \"daemon_type\": \"mds\", \"events\": [\"2026-03-20T09:45:39.110118Z daemon:mds.mds.np0005652760.wqlirt [INFO] \\\"Deployed mds.mds.np0005652760.wqlirt on host 
'np0005652760.localdomain'\\\"\"], \"hostname\": \"np0005652760.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:45:43.309261Z\", \"memory_usage\": 16735272, \"ports\": [], \"service_name\": \"mds.mds\", \"started\": \"2026-03-20T09:45:38.938949Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}, {\"container_id\": \"a712951ec045\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"1.44%\", \"created\": \"2026-03-20T09:45:36.736480Z\", \"daemon_id\": \"mds.np0005652761.iysiog\", \"daemon_name\": \"mds.mds.np0005652761.iysiog\", \"daemon_type\": \"mds\", \"events\": [\"2026-03-20T09:45:36.818400Z daemon:mds.mds.np0005652761.iysiog [INFO] \\\"Deployed mds.mds.np0005652761.iysiog on host 'np0005652761.localdomain'\\\"\"], \"hostname\": \"np0005652761.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:45:43.263295Z\", \"memory_usage\": 16053698, \"ports\": [], \"service_name\": \"mds.mds\", \"started\": \"2026-03-20T09:45:36.644656Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}]"]} TASK [ceph_migrate : Sleep before moving to the next phase] ******************** Pausing for 30 seconds ok: [np0005652755.localdomain] => {"changed": false, "delta": 30, "echo": true, "rc": 0, "start": "2026-03-20 09:46:13.372678", "stderr": "", "stdout": "Paused for 30.02 seconds", "stop": "2026-03-20 09:46:43.395668", "user_input": ""} TASK [ceph_migrate : Get ceph_cli] ********************************************* skipping: [np0005652755.localdomain] => {"changed": false, 
"false_condition": "ceph_daemons_layout.rgw | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Fail if RGW VIPs are not defined] ************************* skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.rgw | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Set/Unset labels - add] *********************************** skipping: [np0005652755.localdomain] => {"false_condition": "ceph_daemons_layout.rgw | default(true) | bool"} TASK [ceph_migrate : Print nodes] ********************************************** skipping: [np0005652755.localdomain] => {"false_condition": "ceph_daemons_layout.rgw | default(true) | bool"} TASK [ceph_migrate : Set/Unset labels - add] *********************************** skipping: [np0005652755.localdomain] => (item=['np0005652759.localdomain', 'rgw']) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_daemons_layout.rgw | default(true) | bool", "item": ["np0005652759.localdomain", "rgw"], "skip_reason": "Conditional result was False"} skipping: [np0005652755.localdomain] => (item=['np0005652760.localdomain', 'rgw']) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_daemons_layout.rgw | default(true) | bool", "item": ["np0005652760.localdomain", "rgw"], "skip_reason": "Conditional result was False"} skipping: [np0005652755.localdomain] => (item=['np0005652761.localdomain', 'rgw']) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_daemons_layout.rgw | default(true) | bool", "item": ["np0005652761.localdomain", "rgw"], "skip_reason": "Conditional result was False"} skipping: [np0005652755.localdomain] => {"changed": false, "msg": "All items skipped"} TASK [ceph_migrate : RGW - Load Spec from the orchestrator] ******************** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.rgw | 
default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Print the loaded data] ************************************ skipping: [np0005652755.localdomain] => {"false_condition": "ceph_daemons_layout.rgw | default(true) | bool"} TASK [ceph_migrate : Apply ceph rgw keystone config] *************************** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.rgw | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Update the RGW spec definition] *************************** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.rgw | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Print the resulting spec] ********************************* skipping: [np0005652755.localdomain] => {"false_condition": "ceph_daemons_layout.rgw | default(true) | bool"} TASK [ceph_migrate : Create the Ingress Daemon spec definition for RGW] ******** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.rgw | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Print the resulting spec] ********************************* skipping: [np0005652755.localdomain] => {"false_condition": "ceph_daemons_layout.rgw | default(true) | bool"} TASK [ceph_migrate : Wait for cephadm to redeploy] ***************************** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.rgw | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : RGW - wait daemons] *************************************** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.rgw | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Setup a Ceph client to the first node] 
******************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_client.yaml for np0005652755.localdomain TASK [ceph_migrate : TMP_CLIENT - Patch os-net-config config and setup a tmp client IP] *** changed: [np0005652755.localdomain] => {"backup": "/etc/os-net-config/tripleo_config.yaml.475802.2026-03-20@09:46:44~", "changed": true, "msg": "line added and ownership, perms or SE linux context changed"} TASK [ceph_migrate : TMP_CLIENT - Refresh os-net-config] *********************** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["os-net-config", "-c", "/etc/os-net-config/tripleo_config.yaml"], "delta": "0:00:07.296999", "end": "2026-03-20 09:46:52.686884", "msg": "", "rc": 0, "start": "2026-03-20 09:46:45.389885", "stderr": "", "stderr_lines": [], "stdout": "2026-03-20 09:46:46.275 ERROR os_net_config.execute stderr : WARN : [ifdown] You are using 'ifdown' script provided by 'network-scripts', which are now deprecated.\nWARN : [ifdown] 'network-scripts' will be removed from distribution in near future.\nWARN : [ifdown] It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.\n\n2026-03-20 09:46:52.619 ERROR os_net_config.execute stderr : WARN : [ifup] You are using 'ifup' script provided by 'network-scripts', which are now deprecated.\nWARN : [ifup] 'network-scripts' will be removed from distribution in near future.\nWARN : [ifup] It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.", "stdout_lines": ["2026-03-20 09:46:46.275 ERROR os_net_config.execute stderr : WARN : [ifdown] You are using 'ifdown' script provided by 'network-scripts', which are now deprecated.", "WARN : [ifdown] 'network-scripts' will be removed from distribution in near future.", "WARN : [ifdown] It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.", "", "2026-03-20 09:46:52.619 
ERROR os_net_config.execute stderr : WARN : [ifup] You are using 'ifup' script provided by 'network-scripts', which are now deprecated.", "WARN : [ifup] 'network-scripts' will be removed from distribution in near future.", "WARN : [ifup] It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well."]} TASK [ceph_migrate : Backup data for client purposes] ************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/backup.yaml for np0005652755.localdomain TASK [ceph_migrate : Ensure backup directory exists] *************************** changed: [np0005652755.localdomain] => {"changed": true, "gid": 1003, "group": "tripleo-admin", "mode": "0755", "owner": "tripleo-admin", "path": "/home/tripleo-admin/ceph_client", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 6, "state": "directory", "uid": 1003} TASK [ceph_migrate : Check file in the src directory] ************************** ok: [np0005652755.localdomain] => {"changed": false, "examined": 4, "files": [{"atime": 1773993799.7654786, "ctime": 1773993798.68345, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 109052589, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1773992692.3221598, "nlink": 1, "path": "/etc/ceph/ceph.conf", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 271, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, {"atime": 1773993799.777479, "ctime": 1773993798.68445, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 109052588, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0600", "mtime": 1773992551.9966078, "nlink": 1, "path": "/etc/ceph/ceph.client.admin.keyring", "pw_name": 
"root", "rgrp": false, "roth": false, "rusr": true, "size": 151, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, {"atime": 1773993842.578621, "ctime": 1773993840.6275687, "dev": 64516, "gid": 167, "gr_name": "", "inode": 570471057, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1773993840.3125603, "nlink": 1, "path": "/etc/ceph/ceph.client.openstack.keyring", "pw_name": "", "rgrp": true, "roth": true, "rusr": true, "size": 231, "uid": 167, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, {"atime": 1773993843.740652, "ctime": 1773993841.5405931, "dev": 64516, "gid": 167, "gr_name": "", "inode": 595628501, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1773993841.2205846, "nlink": 1, "path": "/etc/ceph/ceph.client.manila.keyring", "pw_name": "", "rgrp": true, "roth": true, "rusr": true, "size": 153, "uid": 167, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}], "matched": 4, "msg": "All paths examined", "skipped_paths": {}} TASK [ceph_migrate : Backup ceph client data] ********************************** changed: [np0005652755.localdomain] => (item={'path': '/etc/ceph/ceph.conf', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 271, 'inode': 109052589, 'dev': 64516, 'nlink': 1, 'atime': 1773993799.7654786, 'mtime': 1773992692.3221598, 'ctime': 1773993798.68345, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) => {"ansible_loop_var": "item", 
"changed": true, "checksum": "61f693fca717c642726de4660061d4efa1f37bd6", "dest": "/home/tripleo-admin/ceph_client/ceph.conf", "gid": 0, "group": "root", "item": {"atime": 1773993799.7654786, "ctime": 1773993798.68345, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 109052589, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1773992692.3221598, "nlink": 1, "path": "/etc/ceph/ceph.conf", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 271, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, "md5sum": "67863e1919a59ea5d169c661d673c6a9", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 271, "src": "/etc/ceph/ceph.conf", "state": "file", "uid": 0} changed: [np0005652755.localdomain] => (item={'path': '/etc/ceph/ceph.client.admin.keyring', 'mode': '0600', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 151, 'inode': 109052588, 'dev': 64516, 'nlink': 1, 'atime': 1773993799.777479, 'mtime': 1773992551.9966078, 'ctime': 1773993798.68445, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': False, 'xgrp': False, 'woth': False, 'roth': False, 'xoth': False, 'isuid': False, 'isgid': False}) => {"ansible_loop_var": "item", "changed": true, "checksum": "fd9bfb4214b0e14a9ab3b9114f5c6030f9b187ed", "dest": "/home/tripleo-admin/ceph_client/ceph.client.admin.keyring", "gid": 0, "group": "root", "item": {"atime": 1773993799.777479, "ctime": 1773993798.68445, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 109052588, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0600", "mtime": 1773992551.9966078, 
"nlink": 1, "path": "/etc/ceph/ceph.client.admin.keyring", "pw_name": "root", "rgrp": false, "roth": false, "rusr": true, "size": 151, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, "md5sum": "00437778ea8eb37cb72a6b283cb52066", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 151, "src": "/etc/ceph/ceph.client.admin.keyring", "state": "file", "uid": 0} changed: [np0005652755.localdomain] => (item={'path': '/etc/ceph/ceph.client.openstack.keyring', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 167, 'gid': 167, 'size': 231, 'inode': 570471057, 'dev': 64516, 'nlink': 1, 'atime': 1773993842.578621, 'mtime': 1773993840.3125603, 'ctime': 1773993840.6275687, 'gr_name': '', 'pw_name': '', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) => {"ansible_loop_var": "item", "changed": true, "checksum": "2febd4926aea2483e17f98ad2caaedaf44a897e8", "dest": "/home/tripleo-admin/ceph_client/ceph.client.openstack.keyring", "gid": 0, "group": "root", "item": {"atime": 1773993842.578621, "ctime": 1773993840.6275687, "dev": 64516, "gid": 167, "gr_name": "", "inode": 570471057, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1773993840.3125603, "nlink": 1, "path": "/etc/ceph/ceph.client.openstack.keyring", "pw_name": "", "rgrp": true, "roth": true, "rusr": true, "size": 231, "uid": 167, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, "md5sum": "bed4fc47f91cefb81761a546916c12fc", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 231, "src": "/etc/ceph/ceph.client.openstack.keyring", 
"state": "file", "uid": 0} changed: [np0005652755.localdomain] => (item={'path': '/etc/ceph/ceph.client.manila.keyring', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 167, 'gid': 167, 'size': 153, 'inode': 595628501, 'dev': 64516, 'nlink': 1, 'atime': 1773993843.740652, 'mtime': 1773993841.2205846, 'ctime': 1773993841.5405931, 'gr_name': '', 'pw_name': '', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) => {"ansible_loop_var": "item", "changed": true, "checksum": "12c24e7d6d1787d828171df1f51563103f484814", "dest": "/home/tripleo-admin/ceph_client/ceph.client.manila.keyring", "gid": 0, "group": "root", "item": {"atime": 1773993843.740652, "ctime": 1773993841.5405931, "dev": 64516, "gid": 167, "gr_name": "", "inode": 595628501, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1773993841.2205846, "nlink": 1, "path": "/etc/ceph/ceph.client.manila.keyring", "pw_name": "", "rgrp": true, "roth": true, "rusr": true, "size": 153, "uid": 167, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, "md5sum": "941ad89e7776647a7c3a546d10d756f6", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 153, "src": "/etc/ceph/ceph.client.manila.keyring", "state": "file", "uid": 0} TASK [ceph_migrate : Render global ceph.conf] ********************************** changed: [np0005652755.localdomain] => {"changed": true, "checksum": "e4b8377a1e31f6e329c10fe03dac85aea504386f", "dest": "/home/tripleo-admin/ceph_client/ceph.conf", "gid": 0, "group": "root", "md5sum": "21d4fb0ad6afd5c3bbd80853e6e7caaf", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 
142, "src": "/home/tripleo-admin/.ansible/tmp/ansible-tmp-1774000016.874326-61591-203545646482631/source", "state": "file", "uid": 0}

TASK [ceph_migrate : MGR - Migrate RBD node] ***********************************
included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/mgr.yaml for np0005652755.localdomain

TASK [ceph_migrate : Get ceph_cli] *********************************************
included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_cli.yaml for np0005652755.localdomain

TASK [ceph_migrate : Set ceph CLI] *********************************************
ok: [np0005652755.localdomain] => {"ansible_facts": {"ceph_cli": "podman run --rm --net=host --ipc=host --volume /etc/ceph:/etc/ceph:z --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90 -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring "}, "changed": false}

TASK [ceph_migrate : MGR - Setup Mon/Mgr label to the target node] *************
included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/labels.yaml for np0005652755.localdomain

TASK [ceph_migrate : Set/Unset labels - add] ***********************************
skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"}

TASK [ceph_migrate : Print nodes] **********************************************
skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"}

TASK [ceph_migrate : Set/Unset labels - add] ***********************************
changed: [np0005652755.localdomain] => (item=['np0005652759.localdomain', 'mgr']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid",
"39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005652759.localdomain", "mgr"], "delta": "0:00:00.706128", "end": "2026-03-20 09:46:59.899714", "item": ["np0005652759.localdomain", "mgr"], "msg": "", "rc": 0, "start": "2026-03-20 09:46:59.193586", "stderr": "", "stderr_lines": [], "stdout": "Added label mgr to host np0005652759.localdomain", "stdout_lines": ["Added label mgr to host np0005652759.localdomain"]} changed: [np0005652755.localdomain] => (item=['np0005652760.localdomain', 'mgr']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005652760.localdomain", "mgr"], "delta": "0:00:00.646670", "end": "2026-03-20 09:47:01.149334", "item": ["np0005652760.localdomain", "mgr"], "msg": "", "rc": 0, "start": "2026-03-20 09:47:00.502664", "stderr": "", "stderr_lines": [], "stdout": "Added label mgr to host np0005652760.localdomain", "stdout_lines": ["Added label mgr to host np0005652760.localdomain"]} changed: [np0005652755.localdomain] => (item=['np0005652761.localdomain', 'mgr']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005652761.localdomain", "mgr"], "delta": "0:00:00.707042", "end": "2026-03-20 09:47:02.412926", "item": ["np0005652761.localdomain", "mgr"], "msg": "", "rc": 0, "start": "2026-03-20 
09:47:01.705884", "stderr": "", "stderr_lines": [], "stdout": "Added label mgr to host np0005652761.localdomain", "stdout_lines": ["Added label mgr to host np0005652761.localdomain"]}

TASK [ceph_migrate : MGR - Load Spec from the orchestrator] ********************
ok: [np0005652755.localdomain] => {"ansible_facts": {"mgr_spec": {"service_name": "mgr", "service_type": "mgr", "spec": {}}}, "changed": false}

TASK [ceph_migrate : Update the MGR Daemon spec definition] ********************
ok: [np0005652755.localdomain] => {"changed": false, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "-v", "/etc/ceph:/etc/ceph:z", "-v", "/var/lib/ceph/:/var/lib/ceph/:z", "-v", "/var/log/ceph/:/var/log/ceph/:z", "-v", "/home/tripleo-admin/mgr:/home/tripleo-admin/mgr:z", "--entrypoint=ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "-n", "client.admin", "-k", "/etc/ceph/ceph.client.admin.keyring", "--cluster", "ceph", "orch", "apply", "--in-file", "/home/tripleo-admin/mgr"], "delta": "0:00:00.904686", "end": "2026-03-20 09:47:04.034764", "rc": 0, "start": "2026-03-20 09:47:03.130078", "stderr": "", "stderr_lines": [], "stdout": "Scheduled mgr update...", "stdout_lines": ["Scheduled mgr update..."]}

TASK [ceph_migrate : Print the resulting spec] *********************************
skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"}

TASK [ceph_migrate : MGR - wait daemons] ***************************************
included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/wait_daemons.yaml for np0005652755.localdomain

TASK [ceph_migrate : print daemon id option] ***********************************
skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"}

TASK [ceph_migrate : wait for mgr] *********************************************
changed: [np0005652755.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host",
"--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "ps", "--daemon_type", "mgr", "-f", "json"], "delta": "0:00:00.670494", "end": "2026-03-20 09:47:05.485592", "msg": "", "rc": 0, "start": "2026-03-20 09:47:04.815098", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"container_id\": \"9c27c4e992ea\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"3.52%\", \"created\": \"2026-03-20T07:41:54.580085Z\", \"daemon_id\": \"np0005652755.qlfqum\", \"daemon_name\": \"mgr.np0005652755.qlfqum\", \"daemon_type\": \"mgr\", \"events\": [\"2026-03-20T07:44:55.432741Z daemon:mgr.np0005652755.qlfqum [INFO] \\\"Reconfigured mgr.np0005652755.qlfqum on host 'np0005652755.localdomain'\\\"\"], \"hostname\": \"np0005652755.localdomain\", \"is_active\": true, \"last_refresh\": \"2026-03-20T09:46:36.336868Z\", \"memory_usage\": 541904076, \"ports\": [9283, 8765], \"service_name\": \"mgr\", \"started\": \"2026-03-20T07:41:53.927880Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}, {\"container_id\": \"3c98d2e3e473\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\"], \"container_image_id\": 
\"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"0.19%\", \"created\": \"2026-03-20T07:44:10.968891Z\", \"daemon_id\": \"np0005652756.blnbtd\", \"daemon_name\": \"mgr.np0005652756.blnbtd\", \"daemon_type\": \"mgr\", \"events\": [\"2026-03-20T07:44:11.049862Z daemon:mgr.np0005652756.blnbtd [INFO] \\\"Deployed mgr.np0005652756.blnbtd on host 'np0005652756.localdomain'\\\"\"], \"hostname\": \"np0005652756.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:46:36.402917Z\", \"memory_usage\": 473327206, \"ports\": [8765], \"service_name\": \"mgr\", \"started\": \"2026-03-20T07:44:10.857396Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}, {\"container_id\": \"3a145016d6af\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"0.25%\", \"created\": \"2026-03-20T07:44:07.738003Z\", \"daemon_id\": \"np0005652757.puuyvp\", \"daemon_name\": \"mgr.np0005652757.puuyvp\", \"daemon_type\": \"mgr\", \"events\": [\"2026-03-20T07:44:09.140825Z daemon:mgr.np0005652757.puuyvp [INFO] \\\"Deployed mgr.np0005652757.puuyvp on host 'np0005652757.localdomain'\\\"\", \"2026-03-20T07:45:01.346626Z daemon:mgr.np0005652757.puuyvp [INFO] \\\"Reconfigured mgr.np0005652757.puuyvp on host 'np0005652757.localdomain'\\\"\"], \"hostname\": \"np0005652757.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:46:36.256374Z\", \"memory_usage\": 471544627, \"ports\": [8765], \"service_name\": \"mgr\", 
\"started\": \"2026-03-20T07:44:04.728973Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}]", "stdout_lines": ["", "[{\"container_id\": \"9c27c4e992ea\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"3.52%\", \"created\": \"2026-03-20T07:41:54.580085Z\", \"daemon_id\": \"np0005652755.qlfqum\", \"daemon_name\": \"mgr.np0005652755.qlfqum\", \"daemon_type\": \"mgr\", \"events\": [\"2026-03-20T07:44:55.432741Z daemon:mgr.np0005652755.qlfqum [INFO] \\\"Reconfigured mgr.np0005652755.qlfqum on host 'np0005652755.localdomain'\\\"\"], \"hostname\": \"np0005652755.localdomain\", \"is_active\": true, \"last_refresh\": \"2026-03-20T09:46:36.336868Z\", \"memory_usage\": 541904076, \"ports\": [9283, 8765], \"service_name\": \"mgr\", \"started\": \"2026-03-20T07:41:53.927880Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}, {\"container_id\": \"3c98d2e3e473\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"0.19%\", \"created\": \"2026-03-20T07:44:10.968891Z\", \"daemon_id\": \"np0005652756.blnbtd\", \"daemon_name\": \"mgr.np0005652756.blnbtd\", \"daemon_type\": \"mgr\", \"events\": 
[\"2026-03-20T07:44:11.049862Z daemon:mgr.np0005652756.blnbtd [INFO] \\\"Deployed mgr.np0005652756.blnbtd on host 'np0005652756.localdomain'\\\"\"], \"hostname\": \"np0005652756.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:46:36.402917Z\", \"memory_usage\": 473327206, \"ports\": [8765], \"service_name\": \"mgr\", \"started\": \"2026-03-20T07:44:10.857396Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}, {\"container_id\": \"3a145016d6af\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"0.25%\", \"created\": \"2026-03-20T07:44:07.738003Z\", \"daemon_id\": \"np0005652757.puuyvp\", \"daemon_name\": \"mgr.np0005652757.puuyvp\", \"daemon_type\": \"mgr\", \"events\": [\"2026-03-20T07:44:09.140825Z daemon:mgr.np0005652757.puuyvp [INFO] \\\"Deployed mgr.np0005652757.puuyvp on host 'np0005652757.localdomain'\\\"\", \"2026-03-20T07:45:01.346626Z daemon:mgr.np0005652757.puuyvp [INFO] \\\"Reconfigured mgr.np0005652757.puuyvp on host 'np0005652757.localdomain'\\\"\"], \"hostname\": \"np0005652757.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:46:36.256374Z\", \"memory_usage\": 471544627, \"ports\": [8765], \"service_name\": \"mgr\", \"started\": \"2026-03-20T07:44:04.728973Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}]"]} TASK [ceph_migrate : MON - Load Spec from the orchestrator] ******************** ok: [np0005652755.localdomain] => {"ansible_facts": {"mon_spec": {"service_name": "mon", "service_type": "mon", "spec": {}}}, "changed": false} TASK 
[ceph_migrate : Set/Unset labels - add] *********************************** skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Print nodes] ********************************************** skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Set/Unset labels - add] *********************************** changed: [np0005652755.localdomain] => (item=['np0005652755.localdomain', 'mon']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005652755.localdomain", "mon"], "delta": "0:00:00.660346", "end": "2026-03-20 09:47:06.941153", "item": ["np0005652755.localdomain", "mon"], "msg": "", "rc": 0, "start": "2026-03-20 09:47:06.280807", "stderr": "", "stderr_lines": [], "stdout": "Added label mon to host np0005652755.localdomain", "stdout_lines": ["Added label mon to host np0005652755.localdomain"]} changed: [np0005652755.localdomain] => (item=['np0005652755.localdomain', '_admin']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005652755.localdomain", "_admin"], "delta": "0:00:01.142205", "end": "2026-03-20 09:47:08.617911", "item": ["np0005652755.localdomain", "_admin"], "msg": "", "rc": 0, "start": "2026-03-20 09:47:07.475706", "stderr": "", "stderr_lines": [], "stdout": "Added label _admin to host 
np0005652755.localdomain", "stdout_lines": ["Added label _admin to host np0005652755.localdomain"]} changed: [np0005652755.localdomain] => (item=['np0005652756.localdomain', 'mon']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005652756.localdomain", "mon"], "delta": "0:00:00.665988", "end": "2026-03-20 09:47:09.853738", "item": ["np0005652756.localdomain", "mon"], "msg": "", "rc": 0, "start": "2026-03-20 09:47:09.187750", "stderr": "", "stderr_lines": [], "stdout": "Added label mon to host np0005652756.localdomain", "stdout_lines": ["Added label mon to host np0005652756.localdomain"]} changed: [np0005652755.localdomain] => (item=['np0005652756.localdomain', '_admin']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005652756.localdomain", "_admin"], "delta": "0:00:00.781080", "end": "2026-03-20 09:47:11.167245", "item": ["np0005652756.localdomain", "_admin"], "msg": "", "rc": 0, "start": "2026-03-20 09:47:10.386165", "stderr": "", "stderr_lines": [], "stdout": "Added label _admin to host np0005652756.localdomain", "stdout_lines": ["Added label _admin to host np0005652756.localdomain"]} changed: [np0005652755.localdomain] => (item=['np0005652757.localdomain', 'mon']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", 
"/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005652757.localdomain", "mon"], "delta": "0:00:00.696768", "end": "2026-03-20 09:47:12.440380", "item": ["np0005652757.localdomain", "mon"], "msg": "", "rc": 0, "start": "2026-03-20 09:47:11.743612", "stderr": "", "stderr_lines": [], "stdout": "Added label mon to host np0005652757.localdomain", "stdout_lines": ["Added label mon to host np0005652757.localdomain"]} changed: [np0005652755.localdomain] => (item=['np0005652757.localdomain', '_admin']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005652757.localdomain", "_admin"], "delta": "0:00:00.669501", "end": "2026-03-20 09:47:13.645784", "item": ["np0005652757.localdomain", "_admin"], "msg": "", "rc": 0, "start": "2026-03-20 09:47:12.976283", "stderr": "", "stderr_lines": [], "stdout": "Added label _admin to host np0005652757.localdomain", "stdout_lines": ["Added label _admin to host np0005652757.localdomain"]} changed: [np0005652755.localdomain] => (item=['np0005652759.localdomain', 'mon']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005652759.localdomain", "mon"], "delta": "0:00:00.723765", 
"end": "2026-03-20 09:47:14.981862", "item": ["np0005652759.localdomain", "mon"], "msg": "", "rc": 0, "start": "2026-03-20 09:47:14.258097", "stderr": "", "stderr_lines": [], "stdout": "Added label mon to host np0005652759.localdomain", "stdout_lines": ["Added label mon to host np0005652759.localdomain"]} changed: [np0005652755.localdomain] => (item=['np0005652759.localdomain', '_admin']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005652759.localdomain", "_admin"], "delta": "0:00:00.675450", "end": "2026-03-20 09:47:16.277183", "item": ["np0005652759.localdomain", "_admin"], "msg": "", "rc": 0, "start": "2026-03-20 09:47:15.601733", "stderr": "", "stderr_lines": [], "stdout": "Added label _admin to host np0005652759.localdomain", "stdout_lines": ["Added label _admin to host np0005652759.localdomain"]} changed: [np0005652755.localdomain] => (item=['np0005652760.localdomain', 'mon']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005652760.localdomain", "mon"], "delta": "0:00:00.772776", "end": "2026-03-20 09:47:17.621665", "item": ["np0005652760.localdomain", "mon"], "msg": "", "rc": 0, "start": "2026-03-20 09:47:16.848889", "stderr": "", "stderr_lines": [], "stdout": "Added label mon to host np0005652760.localdomain", "stdout_lines": ["Added label mon to host np0005652760.localdomain"]} changed: 
[np0005652755.localdomain] => (item=['np0005652760.localdomain', '_admin']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005652760.localdomain", "_admin"], "delta": "0:00:00.732037", "end": "2026-03-20 09:47:18.935610", "item": ["np0005652760.localdomain", "_admin"], "msg": "", "rc": 0, "start": "2026-03-20 09:47:18.203573", "stderr": "", "stderr_lines": [], "stdout": "Added label _admin to host np0005652760.localdomain", "stdout_lines": ["Added label _admin to host np0005652760.localdomain"]} changed: [np0005652755.localdomain] => (item=['np0005652761.localdomain', 'mon']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005652761.localdomain", "mon"], "delta": "0:00:00.731357", "end": "2026-03-20 09:47:20.243871", "item": ["np0005652761.localdomain", "mon"], "msg": "", "rc": 0, "start": "2026-03-20 09:47:19.512514", "stderr": "", "stderr_lines": [], "stdout": "Added label mon to host np0005652761.localdomain", "stdout_lines": ["Added label mon to host np0005652761.localdomain"]} changed: [np0005652755.localdomain] => (item=['np0005652761.localdomain', '_admin']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", 
"39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005652761.localdomain", "_admin"], "delta": "0:00:00.781417", "end": "2026-03-20 09:47:21.566355", "item": ["np0005652761.localdomain", "_admin"], "msg": "", "rc": 0, "start": "2026-03-20 09:47:20.784938", "stderr": "", "stderr_lines": [], "stdout": "Added label _admin to host np0005652761.localdomain", "stdout_lines": ["Added label _admin to host np0005652761.localdomain"]}

TASK [ceph_migrate : Normalize the mon spec to use labels] *********************
ok: [np0005652755.localdomain] => {"changed": false, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "-v", "/etc/ceph:/etc/ceph:z", "-v", "/var/lib/ceph/:/var/lib/ceph/:z", "-v", "/var/log/ceph/:/var/log/ceph/:z", "-v", "/home/tripleo-admin/mon:/home/tripleo-admin/mon:z", "--entrypoint=ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "-n", "client.admin", "-k", "/etc/ceph/ceph.client.admin.keyring", "--cluster", "ceph", "orch", "apply", "--in-file", "/home/tripleo-admin/mon"], "delta": "0:00:00.774068", "end": "2026-03-20 09:47:23.056753", "rc": 0, "start": "2026-03-20 09:47:22.282685", "stderr": "", "stderr_lines": [], "stdout": "Scheduled mon update...", "stdout_lines": ["Scheduled mon update..."]}

TASK [ceph_migrate : RBD - wait new daemons to be available] *******************
included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/wait_daemons.yaml for np0005652755.localdomain => (item=np0005652759.localdomain)
included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/wait_daemons.yaml for np0005652755.localdomain => (item=np0005652760.localdomain)
included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/wait_daemons.yaml for np0005652755.localdomain =>
(item=np0005652761.localdomain) TASK [ceph_migrate : print daemon id option] *********************************** skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : wait for mon] ********************************************* FAILED - RETRYING: [np0005652755.localdomain]: wait for mon (200 retries left). FAILED - RETRYING: [np0005652755.localdomain]: wait for mon (199 retries left). changed: [np0005652755.localdomain] => {"attempts": 3, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "ps", "--daemon_type", "mon", "--daemon_id", "np0005652759", "-f", "json"], "delta": "0:00:06.712296", "end": "2026-03-20 09:47:46.304296", "msg": "", "rc": 0, "start": "2026-03-20 09:47:39.592000", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"container_id\": \"13df329a2298\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"3.72%\", \"created\": \"2026-03-20T09:47:36.090516Z\", \"daemon_id\": \"np0005652759\", \"daemon_name\": \"mon.np0005652759\", \"daemon_type\": \"mon\", \"events\": [\"2026-03-20T09:47:40.274486Z daemon:mon.np0005652759 [INFO] \\\"Deployed mon.np0005652759 on host 'np0005652759.localdomain'\\\"\"], \"hostname\": \"np0005652759.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:47:42.519051Z\", \"memory_request\": 
2147483648, \"memory_usage\": 47081062, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2026-03-20T09:47:35.977908Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}]", "stdout_lines": ["", "[{\"container_id\": \"13df329a2298\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"3.72%\", \"created\": \"2026-03-20T09:47:36.090516Z\", \"daemon_id\": \"np0005652759\", \"daemon_name\": \"mon.np0005652759\", \"daemon_type\": \"mon\", \"events\": [\"2026-03-20T09:47:40.274486Z daemon:mon.np0005652759 [INFO] \\\"Deployed mon.np0005652759 on host 'np0005652759.localdomain'\\\"\"], \"hostname\": \"np0005652759.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:47:42.519051Z\", \"memory_request\": 2147483648, \"memory_usage\": 47081062, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2026-03-20T09:47:35.977908Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}]"]} TASK [ceph_migrate : print daemon id option] *********************************** skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : wait for mon] ********************************************* changed: [np0005652755.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", 
"orch", "ps", "--daemon_type", "mon", "--daemon_id", "np0005652760", "-f", "json"], "delta": "0:00:00.830171", "end": "2026-03-20 09:47:47.827057", "msg": "", "rc": 0, "start": "2026-03-20 09:47:46.996886", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"container_id\": \"314789bdcc03\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"2.38%\", \"created\": \"2026-03-20T09:47:30.847884Z\", \"daemon_id\": \"np0005652760\", \"daemon_name\": \"mon.np0005652760\", \"daemon_type\": \"mon\", \"events\": [\"2026-03-20T09:47:33.575275Z daemon:mon.np0005652760 [INFO] \\\"Deployed mon.np0005652760 on host 'np0005652760.localdomain'\\\"\"], \"hostname\": \"np0005652760.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:47:42.531775Z\", \"memory_request\": 2147483648, \"memory_usage\": 44920995, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2026-03-20T09:47:30.748114Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}]", "stdout_lines": ["", "[{\"container_id\": \"314789bdcc03\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"2.38%\", \"created\": \"2026-03-20T09:47:30.847884Z\", \"daemon_id\": 
\"np0005652760\", \"daemon_name\": \"mon.np0005652760\", \"daemon_type\": \"mon\", \"events\": [\"2026-03-20T09:47:33.575275Z daemon:mon.np0005652760 [INFO] \\\"Deployed mon.np0005652760 on host 'np0005652760.localdomain'\\\"\"], \"hostname\": \"np0005652760.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:47:42.531775Z\", \"memory_request\": 2147483648, \"memory_usage\": 44920995, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2026-03-20T09:47:30.748114Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}]"]} TASK [ceph_migrate : print daemon id option] *********************************** skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : wait for mon] ********************************************* changed: [np0005652755.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "ps", "--daemon_type", "mon", "--daemon_id", "np0005652761", "-f", "json"], "delta": "0:00:00.767955", "end": "2026-03-20 09:47:49.346615", "msg": "", "rc": 0, "start": "2026-03-20 09:47:48.578660", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"container_id\": \"81878b4169a8\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"2.61%\", \"created\": 
\"2026-03-20T09:47:28.092398Z\", \"daemon_id\": \"np0005652761\", \"daemon_name\": \"mon.np0005652761\", \"daemon_type\": \"mon\", \"events\": [\"2026-03-20T09:47:28.185018Z daemon:mon.np0005652761 [INFO] \\\"Deployed mon.np0005652761 on host 'np0005652761.localdomain'\\\"\"], \"hostname\": \"np0005652761.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:47:42.547085Z\", \"memory_request\": 2147483648, \"memory_usage\": 44480593, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2026-03-20T09:47:27.985362Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}]", "stdout_lines": ["", "[{\"container_id\": \"81878b4169a8\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"2.61%\", \"created\": \"2026-03-20T09:47:28.092398Z\", \"daemon_id\": \"np0005652761\", \"daemon_name\": \"mon.np0005652761\", \"daemon_type\": \"mon\", \"events\": [\"2026-03-20T09:47:28.185018Z daemon:mon.np0005652761 [INFO] \\\"Deployed mon.np0005652761 on host 'np0005652761.localdomain'\\\"\"], \"hostname\": \"np0005652761.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:47:42.547085Z\", \"memory_request\": 2147483648, \"memory_usage\": 44480593, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2026-03-20T09:47:27.985362Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}]"]} TASK [ceph_migrate : MON - Migrate RBD node] *********************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/mon.yaml 
for np0005652755.localdomain => (item=['np0005652755.localdomain', 'np0005652759.localdomain']) included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/mon.yaml for np0005652755.localdomain => (item=['np0005652756.localdomain', 'np0005652760.localdomain']) included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/mon.yaml for np0005652755.localdomain => (item=['np0005652757.localdomain', 'np0005652761.localdomain']) TASK [ceph_migrate : Get ceph_cli] ********************************************* included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_cli.yaml for np0005652755.localdomain TASK [ceph_migrate : Set ceph CLI] ********************************************* ok: [np0005652755.localdomain] => {"ansible_facts": {"ceph_cli": "podman run --rm --net=host --ipc=host --volume /home/tripleo-admin/ceph_client:/etc/ceph:z --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90 -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring "}, "changed": false} TASK [ceph_migrate : Migrate mon] ********************************************** ok: [np0005652755.localdomain] => { "msg": "Migrate mon: np0005652755.localdomain to node: np0005652759.localdomain" } TASK [ceph_migrate : MON - Get current mon IP address from node_map override] *** ok: [np0005652755.localdomain] => {"ansible_facts": {"mon_ipaddr": "172.18.0.103"}, "changed": false} TASK [ceph_migrate : MON - Check mons quorum] ********************************** changed: [np0005652755.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", 
"/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "-s", "-f", "json"], "delta": "0:00:00.775403", "end": "2026-03-20 09:47:51.198519", "msg": "", "rc": 0, "start": "2026-03-20 09:47:50.423116", "stderr": "", "stderr_lines": [], "stdout": "\n{\"fsid\":\"39ff5591-b969-58ac-89fa-bf85e4fa1d90\",\"health\":{\"status\":\"HEALTH_OK\",\"checks\":{},\"mutes\":[]},\"election_epoch\":28,\"quorum\":[0,1,2,3,4,5],\"quorum_names\":[\"np0005652755\",\"np0005652757\",\"np0005652756\",\"np0005652761\",\"np0005652760\",\"np0005652759\"],\"quorum_age\":4,\"monmap\":{\"epoch\":6,\"min_mon_release_name\":\"reef\",\"num_mons\":6},\"osdmap\":{\"epoch\":85,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1773992691,\"num_in_osds\":6,\"osd_in_since\":1773992670,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109571242,\"bytes_used\":607301632,\"bytes_avail\":44464689152,\"bytes_total\":45071990784},\"fsmap\":{\"epoch\":16,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005652760.wqlirt\",\"status\":\"up:active\",\"gid\":26492}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":5,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":73,\"modified\":\"2026-03-20T09:47:21.267364+0000\",\"services\":{\"mgr\":{\"daemons\":{\"summary\":\"\",\"np0005652759.zlbibv\":{\"start_epoch\":0,\"start_stamp\":\"0.000000\",\"gid\":0,\"addr\":\"(unrecognized address family 0)/0\",\"metadata\":{},\"task_status\":{}},\"np0005652760.qrhkos\":{\"start_epoch\":0,\"start_stamp\":\"0.000000\",\"gid\":0,\"addr\":\"(unrecognized address family 0)/0\",\"metadata\":{},\"task_status\":{}},\"np0005652761.zmdusi\":{\"start_epoch\":0,\"start_stamp\":\"0.000000\",\"gid\":0,\"addr\":\"(unrecognized address family 
0)/0\",\"metadata\":{},\"task_status\":{}}}}}},\"progress_events\":{}}", "stdout_lines": ["", "{\"fsid\":\"39ff5591-b969-58ac-89fa-bf85e4fa1d90\",\"health\":{\"status\":\"HEALTH_OK\",\"checks\":{},\"mutes\":[]},\"election_epoch\":28,\"quorum\":[0,1,2,3,4,5],\"quorum_names\":[\"np0005652755\",\"np0005652757\",\"np0005652756\",\"np0005652761\",\"np0005652760\",\"np0005652759\"],\"quorum_age\":4,\"monmap\":{\"epoch\":6,\"min_mon_release_name\":\"reef\",\"num_mons\":6},\"osdmap\":{\"epoch\":85,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1773992691,\"num_in_osds\":6,\"osd_in_since\":1773992670,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109571242,\"bytes_used\":607301632,\"bytes_avail\":44464689152,\"bytes_total\":45071990784},\"fsmap\":{\"epoch\":16,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005652760.wqlirt\",\"status\":\"up:active\",\"gid\":26492}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":5,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":73,\"modified\":\"2026-03-20T09:47:21.267364+0000\",\"services\":{\"mgr\":{\"daemons\":{\"summary\":\"\",\"np0005652759.zlbibv\":{\"start_epoch\":0,\"start_stamp\":\"0.000000\",\"gid\":0,\"addr\":\"(unrecognized address family 0)/0\",\"metadata\":{},\"task_status\":{}},\"np0005652760.qrhkos\":{\"start_epoch\":0,\"start_stamp\":\"0.000000\",\"gid\":0,\"addr\":\"(unrecognized address family 0)/0\",\"metadata\":{},\"task_status\":{}},\"np0005652761.zmdusi\":{\"start_epoch\":0,\"start_stamp\":\"0.000000\",\"gid\":0,\"addr\":\"(unrecognized address family 0)/0\",\"metadata\":{},\"task_status\":{}}}}}},\"progress_events\":{}}"]} TASK [ceph_migrate : Backup data for client purposes] ************************** skipping: [np0005652755.localdomain] => {"changed": false, 
"false_condition": "cur_mon != client_node", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - Get the current active mgr] ************************* changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "mgr", "stat", "-f", "json"], "delta": "0:00:00.797174", "end": "2026-03-20 09:47:52.747317", "msg": "", "rc": 0, "start": "2026-03-20 09:47:51.950143", "stderr": "", "stderr_lines": [], "stdout": "\n{\"epoch\":14,\"available\":true,\"active_name\":\"np0005652755.qlfqum\",\"num_standby\":5}", "stdout_lines": ["", "{\"epoch\":14,\"available\":true,\"active_name\":\"np0005652755.qlfqum\",\"num_standby\":5}"]} TASK [ceph_migrate : MON - Load mgr data] ************************************** ok: [np0005652755.localdomain] => {"ansible_facts": {"mgr": {"active_name": "np0005652755.qlfqum", "available": true, "epoch": 14, "num_standby": 5}}, "changed": false} TASK [ceph_migrate : Print active mgr] ***************************************** skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Fail mgr if active in the current node] ******************* included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/fail_mgr.yaml for np0005652755.localdomain TASK [ceph_migrate : Refresh ceph_cli] ***************************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_cli.yaml for np0005652755.localdomain TASK [ceph_migrate : Set ceph CLI] ********************************************* ok: [np0005652755.localdomain] => {"ansible_facts": {"ceph_cli": 
"podman run --rm --net=host --ipc=host --volume /home/tripleo-admin/ceph_client:/etc/ceph:z --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90 -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring "}, "changed": false} TASK [ceph_migrate : Force fail ceph mgr] ************************************** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "mgr", "fail"], "delta": "0:00:00.893513", "end": "2026-03-20 09:47:54.495475", "msg": "", "rc": 0, "start": "2026-03-20 09:47:53.601962", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : Wait for cephadm to reconcile] **************************** Pausing for 10 seconds ok: [np0005652755.localdomain] => {"changed": false, "delta": 10, "echo": true, "rc": 0, "start": "2026-03-20 09:47:54.626225", "stderr": "", "stdout": "Paused for 10.01 seconds", "stop": "2026-03-20 09:48:04.637915", "user_input": ""} TASK [ceph_migrate : Get the ceph orchestrator status with] ******************** ASYNC OK on np0005652755.localdomain: jid=j228370325295.482846 changed: [np0005652755.localdomain] => {"ansible_job_id": "j228370325295.482846", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "status", "--format", "json"], "delta": "0:00:00.650359", "end": "2026-03-20 09:48:06.199592", "failed_when_result": false, "finished": 1, 
"msg": "", "rc": 0, "results_file": "/root/.ansible_async/j228370325295.482846", "start": "2026-03-20 09:48:05.549233", "started": 1, "stderr": "", "stderr_lines": [], "stdout": "\n{\"available\": true, \"backend\": \"cephadm\", \"paused\": false, \"workers\": 10}", "stdout_lines": ["", "{\"available\": true, \"backend\": \"cephadm\", \"paused\": false, \"workers\": 10}"]} TASK [ceph_migrate : Restart the active mgr] *********************************** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "\"Timeout exceeded\" in ceph_orch_status.msg", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Fail if ceph orchestrator is still not responding] ******** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "\"Timeout exceeded\" in ceph_orch_status.msg", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - Drain and rm --force the cur_mon host] ************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/drain.yaml for np0005652755.localdomain TASK [ceph_migrate : Get ceph_cli] ********************************************* included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_cli.yaml for np0005652755.localdomain TASK [ceph_migrate : Set ceph CLI] ********************************************* ok: [np0005652755.localdomain] => {"ansible_facts": {"ceph_cli": "podman run --rm --net=host --ipc=host --volume /home/tripleo-admin/ceph_client:/etc/ceph:z --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90 -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring "}, "changed": false} TASK [ceph_migrate : MON - wait daemons] *************************************** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", 
"/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "ps", "--daemon_type", "mon", "--daemon_id", "np0005652755", "-f", "json"], "delta": "0:00:00.651664", "end": "2026-03-20 09:48:08.901905", "msg": "", "rc": 0, "start": "2026-03-20 09:48:08.250241", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"container_id\": \"90d579574537\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"1.77%\", \"created\": \"2026-03-20T07:41:49.412949Z\", \"daemon_id\": \"np0005652755\", \"daemon_name\": \"mon.np0005652755\", \"daemon_type\": \"mon\", \"hostname\": \"np0005652755.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:47:57.278134Z\", \"memory_request\": 2147483648, \"memory_usage\": 138726604, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2026-03-20T07:41:52.380235Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}]", "stdout_lines": ["", "[{\"container_id\": \"90d579574537\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": 
\"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"1.77%\", \"created\": \"2026-03-20T07:41:49.412949Z\", \"daemon_id\": \"np0005652755\", \"daemon_name\": \"mon.np0005652755\", \"daemon_type\": \"mon\", \"hostname\": \"np0005652755.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:47:57.278134Z\", \"memory_request\": 2147483648, \"memory_usage\": 138726604, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2026-03-20T07:41:52.380235Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}]"]} TASK [ceph_migrate : DRAIN - Delete the mon running on the current controller node] *** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "daemon", "rm", "mon.np0005652755", "--force"], "delta": "0:00:05.803300", "end": "2026-03-20 09:48:15.291171", "msg": "", "rc": 0, "start": "2026-03-20 09:48:09.487871", "stderr": "", "stderr_lines": [], "stdout": "Removed mon.np0005652755 from host 'np0005652755.localdomain'", "stdout_lines": ["Removed mon.np0005652755 from host 'np0005652755.localdomain'"]} TASK [ceph_migrate : DRAIN - remove label from the src node] ******************* included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/labels.yaml for np0005652755.localdomain TASK [ceph_migrate : Set/Unset labels - rm] ************************************ skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Print nodes] ********************************************** skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate 
: Set/Unset labels - rm] ************************************ changed: [np0005652755.localdomain] => (item=['np0005652755.localdomain', 'mon']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "rm", "np0005652755.localdomain", "mon"], "delta": "0:00:00.760510", "end": "2026-03-20 09:48:16.927174", "item": ["np0005652755.localdomain", "mon"], "msg": "", "rc": 0, "start": "2026-03-20 09:48:16.166664", "stderr": "", "stderr_lines": [], "stdout": "Removed label mon from host np0005652755.localdomain", "stdout_lines": ["Removed label mon from host np0005652755.localdomain"]} changed: [np0005652755.localdomain] => (item=['np0005652755.localdomain', 'mgr']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "rm", "np0005652755.localdomain", "mgr"], "delta": "0:00:00.670025", "end": "2026-03-20 09:48:18.154720", "item": ["np0005652755.localdomain", "mgr"], "msg": "", "rc": 0, "start": "2026-03-20 09:48:17.484695", "stderr": "", "stderr_lines": [], "stdout": "Removed label mgr from host np0005652755.localdomain", "stdout_lines": ["Removed label mgr from host np0005652755.localdomain"]} changed: [np0005652755.localdomain] => (item=['np0005652755.localdomain', '_admin']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", 
"/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "rm", "np0005652755.localdomain", "_admin"], "delta": "0:00:00.738302", "end": "2026-03-20 09:48:19.431067", "item": ["np0005652755.localdomain", "_admin"], "msg": "", "rc": 0, "start": "2026-03-20 09:48:18.692765", "stderr": "", "stderr_lines": [], "stdout": "Removed label _admin from host np0005652755.localdomain", "stdout_lines": ["Removed label _admin from host np0005652755.localdomain"]} TASK [ceph_migrate : Wait for the orchestrator to remove labels] *************** Pausing for 10 seconds ok: [np0005652755.localdomain] => {"changed": false, "delta": 10, "echo": true, "rc": 0, "start": "2026-03-20 09:48:19.572694", "stderr": "", "stdout": "Paused for 10.01 seconds", "stop": "2026-03-20 09:48:29.586189", "user_input": ""} TASK [ceph_migrate : DRAIN - Drain the host] *********************************** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "drain", "np0005652755.localdomain"], "delta": "0:00:00.763627", "end": "2026-03-20 09:48:30.970592", "msg": "", "rc": 0, "start": "2026-03-20 09:48:30.206965", "stderr": "", "stderr_lines": [], "stdout": "Scheduled to remove the following daemons from host 'np0005652755.localdomain'\ntype id \n-------------------- ---------------\nmgr np0005652755.qlfqum\ncrash np0005652755 ", "stdout_lines": ["Scheduled to remove the following daemons from host 'np0005652755.localdomain'", "type id ", "-------------------- 
---------------", "mgr np0005652755.qlfqum", "crash np0005652755 "]} TASK [ceph_migrate : DRAIN - cleanup the host] ********************************* skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "force_clean | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - check host in hostmap] ****************************** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "ls", "--host_pattern", "np0005652755.localdomain", "-f", "json"], "delta": "0:00:00.895308", "end": "2026-03-20 09:48:32.613478", "msg": "", "rc": 0, "start": "2026-03-20 09:48:31.718170", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"addr\": \"192.168.122.103\", \"hostname\": \"np0005652755.localdomain\", \"labels\": [\"_no_schedule\", \"_no_conf_keyring\"], \"status\": \"\"}]", "stdout_lines": ["", "[{\"addr\": \"192.168.122.103\", \"hostname\": \"np0005652755.localdomain\", \"labels\": [\"_no_schedule\", \"_no_conf_keyring\"], \"status\": \"\"}]"]} TASK [ceph_migrate : MON - rm the cur_mon host from the Ceph cluster] ********** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "rm", "np0005652755.localdomain", "--force"], "delta": "0:00:00.720151", "end": "2026-03-20 09:48:34.046165", "msg": "", "rc": 0, "start": "2026-03-20 09:48:33.326014", "stderr": "", 
"stderr_lines": [], "stdout": "Removed host 'np0005652755.localdomain'", "stdout_lines": ["Removed host 'np0005652755.localdomain'"]} TASK [ceph_migrate : MON - Get current mon IP address] ************************* skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "mon_ipaddr is not defined", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - Migrate the mon ip address from src to target node] *** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/network.yaml for np0005652755.localdomain TASK [ceph_migrate : MON - Print current mon IP address] *********************** skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : MON - Patch net-config config and remove the IP address (src node)] *** changed: [np0005652755.localdomain] => {"backup": "/etc/os-net-config/tripleo_config.yaml.485238.2026-03-20@09:48:34~", "changed": true, "found": 1, "msg": "1 line(s) removed"} TASK [ceph_migrate : MON - Refresh os-net-config (src node)] ******************* skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "not manual_migration | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - manually rm the ip address (src node)] ************** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["ip", "a", "del", "172.18.0.103/24", "dev", "vlan21"], "delta": "0:00:00.004870", "end": "2026-03-20 09:48:35.451647", "msg": "", "rc": 0, "start": "2026-03-20 09:48:35.446777", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : MON - Patch net-config config and add the IP address (target node)] *** changed: [np0005652755.localdomain -> np0005652759.localdomain(192.168.122.106)] => {"backup": "/etc/os-net-config/tripleo_config.yaml.289636.2026-03-20@09:48:36~", "changed": true, "msg": "line added"} TASK [ceph_migrate : 
MON - Refresh os-net-config (target_node)] ****************
skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "not manual_migration | bool | default(false)", "skip_reason": "Conditional result was False"}

TASK [ceph_migrate : MON - statically assign the ip address to the target node] ***
changed: [np0005652755.localdomain -> np0005652759.localdomain(192.168.122.106)] => {"changed": true, "cmd": ["ip", "a", "add", "172.18.0.103/24", "dev", "vlan21"], "delta": "0:00:00.005905", "end": "2026-03-20 09:48:37.582207", "msg": "", "rc": 0, "start": "2026-03-20 09:48:37.576302", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []}

TASK [ceph_migrate : MON - ping ip address to see if is reachable on the target node] ***
changed: [np0005652755.localdomain -> np0005652759.localdomain(192.168.122.106)] => {"changed": true, "cmd": ["ping", "-W1", "-c", "3", "172.18.0.103"], "delta": "0:00:02.050423", "end": "2026-03-20 09:48:40.406772", "msg": "", "rc": 0, "start": "2026-03-20 09:48:38.356349", "stderr": "", "stderr_lines": [], "stdout": "PING 172.18.0.103 (172.18.0.103) 56(84) bytes of data.\n64 bytes from 172.18.0.103: icmp_seq=1 ttl=64 time=0.053 ms\n64 bytes from 172.18.0.103: icmp_seq=2 ttl=64 time=0.076 ms\n64 bytes from 172.18.0.103: icmp_seq=3 ttl=64 time=0.532 ms\n\n--- 172.18.0.103 ping statistics ---\n3 packets transmitted, 3 received, 0% packet loss, time 2045ms\nrtt min/avg/max/mdev = 0.053/0.220/0.532/0.220 ms", "stdout_lines": ["PING 172.18.0.103 (172.18.0.103) 56(84) bytes of data.", "64 bytes from 172.18.0.103: icmp_seq=1 ttl=64 time=0.053 ms", "64 bytes from 172.18.0.103: icmp_seq=2 ttl=64 time=0.076 ms", "64 bytes from 172.18.0.103: icmp_seq=3 ttl=64 time=0.532 ms", "", "--- 172.18.0.103 ping statistics ---", "3 packets transmitted, 3 received, 0% packet loss, time 2045ms", "rtt min/avg/max/mdev = 0.053/0.220/0.532/0.220 ms"]}

TASK [ceph_migrate : MON - Fail if the IP address is not active in the target node] ***
skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ping_target_ip.rc != 0", "skip_reason": "Conditional result was False"}

TASK [ceph_migrate : Unmanage mons] ********************************************
ok: [np0005652755.localdomain] => {"changed": false, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "-v", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "-v", "/var/lib/ceph/:/var/lib/ceph/:z", "-v", "/var/log/ceph/:/var/log/ceph/:z", "-v", "/home/tripleo-admin/mon:/home/tripleo-admin/mon:z", "--entrypoint=ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "-n", "client.admin", "-k", "/etc/ceph/ceph.client.admin.keyring", "--cluster", "ceph", "orch", "apply", "--in-file", "/home/tripleo-admin/mon"], "delta": "0:00:00.692494", "end": "2026-03-20 09:48:41.797091", "rc": 0, "start": "2026-03-20 09:48:41.104597", "stderr": "", "stderr_lines": [], "stdout": "Scheduled mon update...", "stdout_lines": ["Scheduled mon update..."]}

TASK [ceph_migrate : Print the resulting spec] *********************************
skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"}

TASK [ceph_migrate : MON - Get tmp mon] ****************************************
changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "ps", "--daemon_type", "mon", "--daemon_id", "np0005652759", "-f", "json"], "delta": "0:00:00.678174", "end": "2026-03-20 09:48:43.182792", "msg": "", "rc": 0, "start": "2026-03-20 09:48:42.504618", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"container_id\": \"13df329a2298\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"2.73%\", \"created\": \"2026-03-20T09:47:36.090516Z\", \"daemon_id\": \"np0005652759\", \"daemon_name\": \"mon.np0005652759\", \"daemon_type\": \"mon\", \"events\": [\"2026-03-20T09:48:15.978305Z daemon:mon.np0005652759 [INFO] \\\"Reconfigured mon.np0005652759 on host 'np0005652759.localdomain'\\\"\"], \"hostname\": \"np0005652759.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:47:56.502968Z\", \"memory_request\": 2147483648, \"memory_usage\": 53896806, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2026-03-20T09:47:35.977908Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}]", "stdout_lines": ["", "[{\"container_id\": \"13df329a2298\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"2.73%\", \"created\": \"2026-03-20T09:47:36.090516Z\", \"daemon_id\": \"np0005652759\", \"daemon_name\": \"mon.np0005652759\", \"daemon_type\": \"mon\", \"events\": [\"2026-03-20T09:48:15.978305Z daemon:mon.np0005652759 [INFO] \\\"Reconfigured mon.np0005652759 on host 'np0005652759.localdomain'\\\"\"], \"hostname\": \"np0005652759.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:47:56.502968Z\", \"memory_request\": 2147483648, \"memory_usage\": 53896806, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2026-03-20T09:47:35.977908Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}]"]}

TASK [ceph_migrate : MON - Delete the running mon] *****************************
changed: [np0005652755.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "daemon", "rm", "mon.np0005652759", "--force"], "delta": "0:00:02.228828", "end": "2026-03-20 09:48:46.062241", "msg": "", "rc": 0, "start": "2026-03-20 09:48:43.833413", "stderr": "", "stderr_lines": [], "stdout": "Removed mon.np0005652759 from host 'np0005652759.localdomain'", "stdout_lines": ["Removed mon.np0005652759 from host 'np0005652759.localdomain'"]}

TASK [ceph_migrate : MON - Wait for the current mon to be deleted] *************
Pausing for 10 seconds
ok: [np0005652755.localdomain] => {"changed": false, "delta": 10, "echo": true, "rc": 0, "start": "2026-03-20 09:48:46.213324", "stderr": "", "stdout": "Paused for 10.01 seconds", "stop": "2026-03-20 09:48:56.222648", "user_input": ""}

TASK [ceph_migrate : MON - Redeploy mon on np0005652759.localdomain] ***********
skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"}

TASK [ceph_migrate : MON - Redeploy mon on np0005652759.localdomain] ***********
changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90",
"-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "daemon", "add", "mon", "np0005652759.localdomain:172.18.0.103"], "delta": "0:00:03.378981", "end": "2026-03-20 09:49:00.507359", "msg": "", "rc": 0, "start": "2026-03-20 09:48:57.128378", "stderr": "", "stderr_lines": [], "stdout": "Deployed mon.np0005652759 on host 'np0005652759.localdomain'", "stdout_lines": ["Deployed mon.np0005652759 on host 'np0005652759.localdomain'"]} TASK [ceph_migrate : MON - Wait for the spec to be updated] ******************** Pausing for 10 seconds ok: [np0005652755.localdomain] => {"changed": false, "delta": 10, "echo": true, "rc": 0, "start": "2026-03-20 09:49:00.661417", "stderr": "", "stdout": "Paused for 10.01 seconds", "stop": "2026-03-20 09:49:10.675019", "user_input": ""} TASK [ceph_migrate : MON - Check mons quorum] ********************************** changed: [np0005652755.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "-s", "-f", "json"], "delta": "0:00:00.730721", "end": "2026-03-20 09:49:11.891255", "msg": "", "rc": 0, "start": "2026-03-20 09:49:11.160534", "stderr": "", "stderr_lines": [], "stdout": 
"\n{\"fsid\":\"39ff5591-b969-58ac-89fa-bf85e4fa1d90\",\"health\":{\"status\":\"HEALTH_OK\",\"checks\":{},\"mutes\":[]},\"election_epoch\":42,\"quorum\":[0,1,2,3,4],\"quorum_names\":[\"np0005652757\",\"np0005652756\",\"np0005652761\",\"np0005652760\",\"np0005652759\"],\"quorum_age\":5,\"monmap\":{\"epoch\":9,\"min_mon_release_name\":\"reef\",\"num_mons\":5},\"osdmap\":{\"epoch\":86,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1773992691,\"num_in_osds\":6,\"osd_in_since\":1773992670,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109572084,\"bytes_used\":607358976,\"bytes_avail\":44464631808,\"bytes_total\":45071990784},\"fsmap\":{\"epoch\":16,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005652760.wqlirt\",\"status\":\"up:active\",\"gid\":26492}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":5,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":77,\"modified\":\"2026-03-20T09:49:02.382880+0000\",\"services\":{}},\"progress_events\":{}}", "stdout_lines": ["", 
"{\"fsid\":\"39ff5591-b969-58ac-89fa-bf85e4fa1d90\",\"health\":{\"status\":\"HEALTH_OK\",\"checks\":{},\"mutes\":[]},\"election_epoch\":42,\"quorum\":[0,1,2,3,4],\"quorum_names\":[\"np0005652757\",\"np0005652756\",\"np0005652761\",\"np0005652760\",\"np0005652759\"],\"quorum_age\":5,\"monmap\":{\"epoch\":9,\"min_mon_release_name\":\"reef\",\"num_mons\":5},\"osdmap\":{\"epoch\":86,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1773992691,\"num_in_osds\":6,\"osd_in_since\":1773992670,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109572084,\"bytes_used\":607358976,\"bytes_avail\":44464631808,\"bytes_total\":45071990784},\"fsmap\":{\"epoch\":16,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005652760.wqlirt\",\"status\":\"up:active\",\"gid\":26492}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":5,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":77,\"modified\":\"2026-03-20T09:49:02.382880+0000\",\"services\":{}},\"progress_events\":{}}"]} TASK [ceph_migrate : reconfig osds] ******************************************** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "reconfig", "osd.default_drive_group"], "delta": "0:00:01.261345", "end": "2026-03-20 09:49:13.766594", "msg": "", "rc": 0, "start": "2026-03-20 09:49:12.505249", "stderr": "", "stderr_lines": [], "stdout": "Scheduled to reconfig osd.2 on host 'np0005652759.localdomain'\nScheduled to reconfig osd.5 on host 
'np0005652759.localdomain'\nScheduled to reconfig osd.1 on host 'np0005652760.localdomain'\nScheduled to reconfig osd.4 on host 'np0005652760.localdomain'\nScheduled to reconfig osd.0 on host 'np0005652761.localdomain'\nScheduled to reconfig osd.3 on host 'np0005652761.localdomain'", "stdout_lines": ["Scheduled to reconfig osd.2 on host 'np0005652759.localdomain'", "Scheduled to reconfig osd.5 on host 'np0005652759.localdomain'", "Scheduled to reconfig osd.1 on host 'np0005652760.localdomain'", "Scheduled to reconfig osd.4 on host 'np0005652760.localdomain'", "Scheduled to reconfig osd.0 on host 'np0005652761.localdomain'", "Scheduled to reconfig osd.3 on host 'np0005652761.localdomain'"]} TASK [ceph_migrate : force-fail ceph mgr] ************************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/fail_mgr.yaml for np0005652755.localdomain TASK [ceph_migrate : Refresh ceph_cli] ***************************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_cli.yaml for np0005652755.localdomain TASK [ceph_migrate : Set ceph CLI] ********************************************* ok: [np0005652755.localdomain] => {"ansible_facts": {"ceph_cli": "podman run --rm --net=host --ipc=host --volume /home/tripleo-admin/ceph_client:/etc/ceph:z --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90 -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring "}, "changed": false} TASK [ceph_migrate : Force fail ceph mgr] ************************************** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", 
"/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "mgr", "fail"], "delta": "0:00:00.918001", "end": "2026-03-20 09:49:15.449674", "msg": "", "rc": 0, "start": "2026-03-20 09:49:14.531673", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : Wait for cephadm to reconcile] **************************** Pausing for 10 seconds ok: [np0005652755.localdomain] => {"changed": false, "delta": 10, "echo": true, "rc": 0, "start": "2026-03-20 09:49:15.569060", "stderr": "", "stdout": "Paused for 10.05 seconds", "stop": "2026-03-20 09:49:25.620653", "user_input": ""} TASK [ceph_migrate : Get the ceph orchestrator status with] ******************** ASYNC OK on np0005652755.localdomain: jid=j63163174748.486898 changed: [np0005652755.localdomain] => {"ansible_job_id": "j63163174748.486898", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "status", "--format", "json"], "delta": "0:00:00.718247", "end": "2026-03-20 09:49:27.121030", "failed_when_result": false, "finished": 1, "msg": "", "rc": 0, "results_file": "/root/.ansible_async/j63163174748.486898", "start": "2026-03-20 09:49:26.402783", "started": 1, "stderr": "", "stderr_lines": [], "stdout": "\n{\"available\": true, \"backend\": \"cephadm\", \"paused\": false, \"workers\": 10}", "stdout_lines": ["", "{\"available\": true, \"backend\": \"cephadm\", \"paused\": false, \"workers\": 10}"]} TASK [ceph_migrate : Restart the active mgr] *********************************** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "\"Timeout exceeded\" in ceph_orch_status.msg", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Fail if ceph orchestrator 
is still not responding] ******** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "\"Timeout exceeded\" in ceph_orch_status.msg", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - Manage mons] **************************************** ok: [np0005652755.localdomain] => {"changed": false, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "-v", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "-v", "/var/lib/ceph/:/var/lib/ceph/:z", "-v", "/var/log/ceph/:/var/log/ceph/:z", "-v", "/home/tripleo-admin/mon:/home/tripleo-admin/mon:z", "--entrypoint=ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "-n", "client.admin", "-k", "/etc/ceph/ceph.client.admin.keyring", "--cluster", "ceph", "orch", "apply", "--in-file", "/home/tripleo-admin/mon"], "delta": "0:00:00.728425", "end": "2026-03-20 09:49:29.747909", "rc": 0, "start": "2026-03-20 09:49:29.019484", "stderr": "", "stderr_lines": [], "stdout": "Scheduled mon update...", "stdout_lines": ["Scheduled mon update..."]} TASK [ceph_migrate : MON - wait daemons] *************************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/wait_daemons.yaml for np0005652755.localdomain TASK [ceph_migrate : print daemon id option] *********************************** skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : wait for mon] ********************************************* changed: [np0005652755.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "ps", "--daemon_type", "mon", "--daemon_id", "np0005652759", "-f", 
"json"], "delta": "0:00:00.862805", "end": "2026-03-20 09:49:31.362498", "msg": "", "rc": 0, "start": "2026-03-20 09:49:30.499693", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"container_id\": \"6a730db4ef21\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"4.41%\", \"created\": \"2026-03-20T09:49:00.288078Z\", \"daemon_id\": \"np0005652759\", \"daemon_name\": \"mon.np0005652759\", \"daemon_type\": \"mon\", \"hostname\": \"np0005652759.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:49:17.428484Z\", \"memory_request\": 2147483648, \"memory_usage\": 43799019, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2026-03-20T09:49:00.189209Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}]", "stdout_lines": ["", "[{\"container_id\": \"6a730db4ef21\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"4.41%\", \"created\": \"2026-03-20T09:49:00.288078Z\", \"daemon_id\": \"np0005652759\", \"daemon_name\": \"mon.np0005652759\", \"daemon_type\": \"mon\", \"hostname\": \"np0005652759.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:49:17.428484Z\", \"memory_request\": 
2147483648, \"memory_usage\": 43799019, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2026-03-20T09:49:00.189209Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}]"]} TASK [ceph_migrate : Get ceph_cli] ********************************************* included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_cli.yaml for np0005652755.localdomain TASK [ceph_migrate : Set ceph CLI] ********************************************* ok: [np0005652755.localdomain] => {"ansible_facts": {"ceph_cli": "podman run --rm --net=host --ipc=host --volume /home/tripleo-admin/ceph_client:/etc/ceph:z --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90 -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring "}, "changed": false} TASK [ceph_migrate : Migrate mon] ********************************************** ok: [np0005652755.localdomain] => { "msg": "Migrate mon: np0005652756.localdomain to node: np0005652760.localdomain" } TASK [ceph_migrate : MON - Get current mon IP address from node_map override] *** ok: [np0005652755.localdomain] => {"ansible_facts": {"mon_ipaddr": "172.18.0.104"}, "changed": false} TASK [ceph_migrate : MON - Check mons quorum] ********************************** changed: [np0005652755.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "-s", "-f", "json"], "delta": "0:00:00.668254", "end": "2026-03-20 09:49:32.995355", "msg": "", "rc": 0, "start": "2026-03-20 09:49:32.327101", "stderr": "", "stderr_lines": [], "stdout": 
"\n{\"fsid\":\"39ff5591-b969-58ac-89fa-bf85e4fa1d90\",\"health\":{\"status\":\"HEALTH_WARN\",\"checks\":{\"CEPHADM_STRAY_DAEMON\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"1 stray daemon(s) not managed by cephadm\",\"count\":1},\"muted\":false},\"CEPHADM_STRAY_HOST\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"1 stray host(s) with 1 daemon(s) not managed by cephadm\",\"count\":1},\"muted\":false}},\"mutes\":[]},\"election_epoch\":42,\"quorum\":[0,1,2,3,4],\"quorum_names\":[\"np0005652757\",\"np0005652756\",\"np0005652761\",\"np0005652760\",\"np0005652759\"],\"quorum_age\":27,\"monmap\":{\"epoch\":9,\"min_mon_release_name\":\"reef\",\"num_mons\":5},\"osdmap\":{\"epoch\":87,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1773992691,\"num_in_osds\":6,\"osd_in_since\":1773992670,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109573208,\"bytes_used\":630112256,\"bytes_avail\":44441878528,\"bytes_total\":45071990784,\"read_bytes_sec\":19623,\"write_bytes_sec\":0,\"read_op_per_sec\":8,\"write_op_per_sec\":1},\"fsmap\":{\"epoch\":16,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005652760.wqlirt\",\"status\":\"up:active\",\"gid\":26492}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":5,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":77,\"modified\":\"2026-03-20T09:49:02.382880+0000\",\"services\":{}},\"progress_events\":{}}", "stdout_lines": ["", "{\"fsid\":\"39ff5591-b969-58ac-89fa-bf85e4fa1d90\",\"health\":{\"status\":\"HEALTH_WARN\",\"checks\":{\"CEPHADM_STRAY_DAEMON\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"1 stray daemon(s) not managed by cephadm\",\"count\":1},\"muted\":false},\"CEPHADM_STRAY_HOST\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"1 stray host(s) with 1 
daemon(s) not managed by cephadm\",\"count\":1},\"muted\":false}},\"mutes\":[]},\"election_epoch\":42,\"quorum\":[0,1,2,3,4],\"quorum_names\":[\"np0005652757\",\"np0005652756\",\"np0005652761\",\"np0005652760\",\"np0005652759\"],\"quorum_age\":27,\"monmap\":{\"epoch\":9,\"min_mon_release_name\":\"reef\",\"num_mons\":5},\"osdmap\":{\"epoch\":87,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1773992691,\"num_in_osds\":6,\"osd_in_since\":1773992670,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109573208,\"bytes_used\":630112256,\"bytes_avail\":44441878528,\"bytes_total\":45071990784,\"read_bytes_sec\":19623,\"write_bytes_sec\":0,\"read_op_per_sec\":8,\"write_op_per_sec\":1},\"fsmap\":{\"epoch\":16,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005652760.wqlirt\",\"status\":\"up:active\",\"gid\":26492}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":5,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":77,\"modified\":\"2026-03-20T09:49:02.382880+0000\",\"services\":{}},\"progress_events\":{}}"]} TASK [ceph_migrate : Backup data for client purposes] ************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/backup.yaml for np0005652755.localdomain TASK [ceph_migrate : Ensure backup directory exists] *************************** changed: [np0005652755.localdomain -> np0005652756.localdomain(192.168.122.104)] => {"changed": true, "gid": 1003, "group": "tripleo-admin", "mode": "0755", "owner": "tripleo-admin", "path": "/home/tripleo-admin/ceph_client", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 6, "state": "directory", "uid": 1003} TASK [ceph_migrate : Check file in the src directory] ************************** ok: 
[np0005652755.localdomain -> np0005652756.localdomain(192.168.122.104)] => {"changed": false, "examined": 2, "files": [{"atime": 1774000161.3873453, "ctime": 1774000161.8113582, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 1216349538, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1774000161.577351, "nlink": 1, "path": "/etc/ceph/ceph.conf", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 367, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, {"atime": 1774000162.7473867, "ctime": 1774000163.1934001, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 1216349537, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0600", "mtime": 1774000162.958393, "nlink": 1, "path": "/etc/ceph/ceph.client.admin.keyring", "pw_name": "root", "rgrp": false, "roth": false, "rusr": true, "size": 151, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}], "matched": 2, "msg": "All paths examined", "skipped_paths": {}}

TASK [ceph_migrate : Backup ceph client data] **********************************
changed: [np0005652755.localdomain -> np0005652756.localdomain(192.168.122.104)] => (item={'path': '/etc/ceph/ceph.conf', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 367, 'inode': 1216349538, 'dev': 64516, 'nlink': 1, 'atime': 1774000161.3873453, 'mtime': 1774000161.577351, 'ctime': 1774000161.8113582, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) => {"ansible_loop_var": "item", "changed": true, "checksum": "d6ac067922ad00c34f8ac15a0a6c0af09a2f7be2", "dest": "/home/tripleo-admin/ceph_client/ceph.conf", "gid": 0, "group": "root", "item": {"atime": 1774000161.3873453, "ctime": 1774000161.8113582, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 1216349538, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1774000161.577351, "nlink": 1, "path": "/etc/ceph/ceph.conf", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 367, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, "md5sum": "7966a96c0c920f2abfd307eba54f9693", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 367, "src": "/etc/ceph/ceph.conf", "state": "file", "uid": 0}
changed: [np0005652755.localdomain -> np0005652756.localdomain(192.168.122.104)] => (item={'path': '/etc/ceph/ceph.client.admin.keyring', 'mode': '0600', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 151, 'inode': 1216349537, 'dev': 64516, 'nlink': 1, 'atime': 1774000162.7473867, 'mtime': 1774000162.958393, 'ctime': 1774000163.1934001, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': False, 'xgrp': False, 'woth': False, 'roth': False, 'xoth': False, 'isuid': False, 'isgid': False}) => {"ansible_loop_var": "item", "changed": true, "checksum": "fd9bfb4214b0e14a9ab3b9114f5c6030f9b187ed", "dest": "/home/tripleo-admin/ceph_client/ceph.client.admin.keyring", "gid": 0, "group": "root", "item": {"atime": 1774000162.7473867, "ctime": 1774000163.1934001, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 1216349537, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0600", "mtime": 1774000162.958393, "nlink": 1, "path": "/etc/ceph/ceph.client.admin.keyring", "pw_name": "root", "rgrp": false, "roth": false, "rusr": true, "size": 151, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, "md5sum": "00437778ea8eb37cb72a6b283cb52066", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 151, "src": "/etc/ceph/ceph.client.admin.keyring", "state": "file", "uid": 0}

TASK [ceph_migrate : MON - Get the current active mgr] *************************
changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "mgr", "stat", "-f", "json"], "delta": "0:00:00.674333", "end": "2026-03-20 09:49:37.817933", "msg": "", "rc": 0, "start": "2026-03-20 09:49:37.143600", "stderr": "", "stderr_lines": [], "stdout": "\n{\"epoch\":24,\"available\":true,\"active_name\":\"np0005652759.zlbibv\",\"num_standby\":4}", "stdout_lines": ["", "{\"epoch\":24,\"available\":true,\"active_name\":\"np0005652759.zlbibv\",\"num_standby\":4}"]}

TASK [ceph_migrate : MON - Load mgr data] **************************************
ok: [np0005652755.localdomain] => {"ansible_facts": {"mgr": {"active_name": "np0005652759.zlbibv", "available": true, "epoch": 24, "num_standby": 4}}, "changed": false}

TASK [ceph_migrate : Print active mgr] *****************************************
skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"}

TASK [ceph_migrate : Fail mgr if active in the current node] *******************
skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "mgr.active_name | regex_search(cur_mon |
split('.') | first) or mgr.active_name | regex_search(target_node | split('.') | first)", "skip_reason": "Conditional result was False"}

TASK [ceph_migrate : MON - Drain and rm --force the cur_mon host] **************
included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/drain.yaml for np0005652755.localdomain

TASK [ceph_migrate : Get ceph_cli] *********************************************
included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_cli.yaml for np0005652755.localdomain

TASK [ceph_migrate : Set ceph CLI] *********************************************
ok: [np0005652755.localdomain] => {"ansible_facts": {"ceph_cli": "podman run --rm --net=host --ipc=host --volume /home/tripleo-admin/ceph_client:/etc/ceph:z --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90 -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring "}, "changed": false}

TASK [ceph_migrate : MON - wait daemons] ***************************************
changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "ps", "--daemon_type", "mon", "--daemon_id", "np0005652756", "-f", "json"], "delta": "0:00:00.654772", "end": "2026-03-20 09:49:39.388876", "msg": "", "rc": 0, "start": "2026-03-20 09:49:38.734104", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"container_id\": \"768498eb47bf\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"1.09%\", \"created\": \"2026-03-20T07:43:59.515545Z\", \"daemon_id\": \"np0005652756\", \"daemon_name\": \"mon.np0005652756\", \"daemon_type\": \"mon\", \"hostname\": \"np0005652756.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:49:17.870074Z\", \"memory_request\": 2147483648, \"memory_usage\": 156342681, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2026-03-20T07:43:59.397074Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}]", "stdout_lines": ["", "[{\"container_id\": \"768498eb47bf\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"1.09%\", \"created\": \"2026-03-20T07:43:59.515545Z\", \"daemon_id\": \"np0005652756\", \"daemon_name\": \"mon.np0005652756\", \"daemon_type\": \"mon\", \"hostname\": \"np0005652756.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:49:17.870074Z\", \"memory_request\": 2147483648, \"memory_usage\": 156342681, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2026-03-20T07:43:59.397074Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}]"]}

TASK [ceph_migrate : DRAIN - Delete the mon running on the current controller node] ***
changed: [np0005652755.localdomain -> np0005652756.localdomain(192.168.122.104)] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "daemon", "rm", "mon.np0005652756", "--force"], "delta": "0:00:02.822046", "end": "2026-03-20 09:49:43.036331", "msg": "", "rc": 0, "start": "2026-03-20 09:49:40.214285", "stderr": "", "stderr_lines": [], "stdout": "Removed mon.np0005652756 from host 'np0005652756.localdomain'", "stdout_lines": ["Removed mon.np0005652756 from host 'np0005652756.localdomain'"]}

TASK [ceph_migrate : DRAIN - remove label from the src node] *******************
included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/labels.yaml for np0005652755.localdomain

TASK [ceph_migrate : Set/Unset labels - rm] ************************************
skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"}

TASK [ceph_migrate : Print nodes] **********************************************
skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"}

TASK [ceph_migrate : Set/Unset labels - rm] ************************************
changed: [np0005652755.localdomain] => (item=['np0005652756.localdomain', 'mon']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "rm", "np0005652756.localdomain", "mon"], "delta": "0:00:00.742421", "end": "2026-03-20 09:49:44.639659", "item": ["np0005652756.localdomain", "mon"], "msg": "", "rc": 0, "start": "2026-03-20 09:49:43.897238", "stderr": "", "stderr_lines": [], "stdout": "Removed label mon from host np0005652756.localdomain", "stdout_lines": ["Removed label mon from host np0005652756.localdomain"]}
changed: [np0005652755.localdomain] => (item=['np0005652756.localdomain', 'mgr']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "rm", "np0005652756.localdomain", "mgr"], "delta": "0:00:00.713712", "end": "2026-03-20 09:49:45.898948", "item": ["np0005652756.localdomain", "mgr"], "msg": "", "rc": 0, "start": "2026-03-20 09:49:45.185236", "stderr": "", "stderr_lines": [], "stdout": "Removed label mgr from host np0005652756.localdomain", "stdout_lines": ["Removed label mgr from host np0005652756.localdomain"]}
changed: [np0005652755.localdomain] => (item=['np0005652756.localdomain', '_admin']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "rm", "np0005652756.localdomain", "_admin"], "delta": "0:00:00.750932", "end": "2026-03-20 09:49:47.192265", "item": ["np0005652756.localdomain", "_admin"], "msg": "", "rc": 0, "start": "2026-03-20 09:49:46.441333", "stderr": "", "stderr_lines": [], "stdout": "Removed label _admin from host np0005652756.localdomain", "stdout_lines": ["Removed label _admin from host
np0005652756.localdomain"]} TASK [ceph_migrate : Wait for the orchestrator to remove labels] *************** Pausing for 10 seconds ok: [np0005652755.localdomain] => {"changed": false, "delta": 10, "echo": true, "rc": 0, "start": "2026-03-20 09:49:47.337776", "stderr": "", "stdout": "Paused for 10.02 seconds", "stop": "2026-03-20 09:49:57.353154", "user_input": ""} TASK [ceph_migrate : DRAIN - Drain the host] *********************************** changed: [np0005652755.localdomain -> np0005652756.localdomain(192.168.122.104)] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "drain", "np0005652756.localdomain"], "delta": "0:00:03.777240", "end": "2026-03-20 09:50:01.782374", "msg": "", "rc": 0, "start": "2026-03-20 09:49:58.005134", "stderr": "", "stderr_lines": [], "stdout": "Scheduled to remove the following daemons from host 'np0005652756.localdomain'\ntype id \n-------------------- ---------------\ncrash np0005652756 ", "stdout_lines": ["Scheduled to remove the following daemons from host 'np0005652756.localdomain'", "type id ", "-------------------- ---------------", "crash np0005652756 "]} TASK [ceph_migrate : DRAIN - cleanup the host] ********************************* skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "force_clean | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - check host in hostmap] ****************************** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", 
"39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "ls", "--host_pattern", "np0005652756.localdomain", "-f", "json"], "delta": "0:00:00.701275", "end": "2026-03-20 09:50:03.201334", "msg": "", "rc": 0, "start": "2026-03-20 09:50:02.500059", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"addr\": \"192.168.122.104\", \"hostname\": \"np0005652756.localdomain\", \"labels\": [\"_no_schedule\", \"_no_conf_keyring\"], \"status\": \"\"}]", "stdout_lines": ["", "[{\"addr\": \"192.168.122.104\", \"hostname\": \"np0005652756.localdomain\", \"labels\": [\"_no_schedule\", \"_no_conf_keyring\"], \"status\": \"\"}]"]} TASK [ceph_migrate : MON - rm the cur_mon host from the Ceph cluster] ********** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "rm", "np0005652756.localdomain", "--force"], "delta": "0:00:00.698863", "end": "2026-03-20 09:50:04.547488", "msg": "", "rc": 0, "start": "2026-03-20 09:50:03.848625", "stderr": "", "stderr_lines": [], "stdout": "Removed host 'np0005652756.localdomain'", "stdout_lines": ["Removed host 'np0005652756.localdomain'"]} TASK [ceph_migrate : MON - Get current mon IP address] ************************* skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "mon_ipaddr is not defined", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - Migrate the mon ip address from src to target node] *** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/network.yaml for np0005652755.localdomain TASK [ceph_migrate : MON - Print current mon 
IP address] *********************** skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : MON - Patch net-config config and remove the IP address (src node)] *** changed: [np0005652755.localdomain -> np0005652756.localdomain(192.168.122.104)] => {"backup": "/etc/os-net-config/tripleo_config.yaml.470183.2026-03-20@09:50:05~", "changed": true, "found": 1, "msg": "1 line(s) removed"} TASK [ceph_migrate : MON - Refresh os-net-config (src node)] ******************* skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "not manual_migration | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - manually rm the ip address (src node)] ************** changed: [np0005652755.localdomain -> np0005652756.localdomain(192.168.122.104)] => {"changed": true, "cmd": ["ip", "a", "del", "172.18.0.104/24", "dev", "vlan21"], "delta": "0:00:00.005345", "end": "2026-03-20 09:50:06.168749", "msg": "", "rc": 0, "start": "2026-03-20 09:50:06.163404", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : MON - Patch net-config config and add the IP address (target node)] *** changed: [np0005652755.localdomain -> np0005652760.localdomain(192.168.122.107)] => {"backup": "/etc/os-net-config/tripleo_config.yaml.292233.2026-03-20@09:50:07~", "changed": true, "msg": "line added"} TASK [ceph_migrate : MON - Refresh os-net-config (target_node)] **************** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "not manual_migration | bool | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - statically assign the ip address to the target node] *** changed: [np0005652755.localdomain -> np0005652760.localdomain(192.168.122.107)] => {"changed": true, "cmd": ["ip", "a", "add", "172.18.0.104/24", "dev", "vlan21"], "delta": "0:00:00.004155", "end": "2026-03-20 09:50:08.222252", "msg": "", 
"rc": 0, "start": "2026-03-20 09:50:08.218097", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : MON - ping ip address to see if is reachable on the target node] *** changed: [np0005652755.localdomain -> np0005652760.localdomain(192.168.122.107)] => {"changed": true, "cmd": ["ping", "-W1", "-c", "3", "172.18.0.104"], "delta": "0:00:02.059920", "end": "2026-03-20 09:50:11.075814", "msg": "", "rc": 0, "start": "2026-03-20 09:50:09.015894", "stderr": "", "stderr_lines": [], "stdout": "PING 172.18.0.104 (172.18.0.104) 56(84) bytes of data.\n64 bytes from 172.18.0.104: icmp_seq=1 ttl=64 time=0.069 ms\n64 bytes from 172.18.0.104: icmp_seq=2 ttl=64 time=0.066 ms\n64 bytes from 172.18.0.104: icmp_seq=3 ttl=64 time=0.086 ms\n\n--- 172.18.0.104 ping statistics ---\n3 packets transmitted, 3 received, 0% packet loss, time 2051ms\nrtt min/avg/max/mdev = 0.066/0.073/0.086/0.008 ms", "stdout_lines": ["PING 172.18.0.104 (172.18.0.104) 56(84) bytes of data.", "64 bytes from 172.18.0.104: icmp_seq=1 ttl=64 time=0.069 ms", "64 bytes from 172.18.0.104: icmp_seq=2 ttl=64 time=0.066 ms", "64 bytes from 172.18.0.104: icmp_seq=3 ttl=64 time=0.086 ms", "", "--- 172.18.0.104 ping statistics ---", "3 packets transmitted, 3 received, 0% packet loss, time 2051ms", "rtt min/avg/max/mdev = 0.066/0.073/0.086/0.008 ms"]} TASK [ceph_migrate : MON - Fail if the IP address is not active in the target node] *** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ping_target_ip.rc != 0", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Unmanage mons] ******************************************** ok: [np0005652755.localdomain] => {"changed": false, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "-v", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "-v", "/var/lib/ceph/:/var/lib/ceph/:z", "-v", "/var/log/ceph/:/var/log/ceph/:z", "-v", "/home/tripleo-admin/mon:/home/tripleo-admin/mon:z", "--entrypoint=ceph", 
"registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "-n", "client.admin", "-k", "/etc/ceph/ceph.client.admin.keyring", "--cluster", "ceph", "orch", "apply", "--in-file", "/home/tripleo-admin/mon"], "delta": "0:00:00.675741", "end": "2026-03-20 09:50:12.375076", "rc": 0, "start": "2026-03-20 09:50:11.699335", "stderr": "", "stderr_lines": [], "stdout": "Scheduled mon update...", "stdout_lines": ["Scheduled mon update..."]} TASK [ceph_migrate : Print the resulting spec] ********************************* skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : MON - Get tmp mon] **************************************** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "ps", "--daemon_type", "mon", "--daemon_id", "np0005652760", "-f", "json"], "delta": "0:00:00.708418", "end": "2026-03-20 09:50:13.701321", "msg": "", "rc": 0, "start": "2026-03-20 09:50:12.992903", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"container_id\": \"314789bdcc03\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"3.23%\", \"created\": \"2026-03-20T09:47:30.847884Z\", \"daemon_id\": \"np0005652760\", \"daemon_name\": \"mon.np0005652760\", \"daemon_type\": \"mon\", \"events\": [\"2026-03-20T09:49:38.691500Z 
daemon:mon.np0005652760 [INFO] \\\"Reconfigured mon.np0005652760 on host 'np0005652760.localdomain'\\\"\"], \"hostname\": \"np0005652760.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:49:17.896334Z\", \"memory_request\": 2147483648, \"memory_usage\": 41827696, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2026-03-20T09:47:30.748114Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}]", "stdout_lines": ["", "[{\"container_id\": \"314789bdcc03\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"3.23%\", \"created\": \"2026-03-20T09:47:30.847884Z\", \"daemon_id\": \"np0005652760\", \"daemon_name\": \"mon.np0005652760\", \"daemon_type\": \"mon\", \"events\": [\"2026-03-20T09:49:38.691500Z daemon:mon.np0005652760 [INFO] \\\"Reconfigured mon.np0005652760 on host 'np0005652760.localdomain'\\\"\"], \"hostname\": \"np0005652760.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:49:17.896334Z\", \"memory_request\": 2147483648, \"memory_usage\": 41827696, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2026-03-20T09:47:30.748114Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}]"]} TASK [ceph_migrate : MON - Delete the running mon] ***************************** changed: [np0005652755.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", 
"39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "daemon", "rm", "mon.np0005652760", "--force"], "delta": "0:00:05.869494", "end": "2026-03-20 09:50:20.211023", "msg": "", "rc": 0, "start": "2026-03-20 09:50:14.341529", "stderr": "", "stderr_lines": [], "stdout": "Removed mon.np0005652760 from host 'np0005652760.localdomain'", "stdout_lines": ["Removed mon.np0005652760 from host 'np0005652760.localdomain'"]} TASK [ceph_migrate : MON - Wait for the current mon to be deleted] ************* Pausing for 10 seconds ok: [np0005652755.localdomain] => {"changed": false, "delta": 10, "echo": true, "rc": 0, "start": "2026-03-20 09:50:20.374561", "stderr": "", "stdout": "Paused for 10.01 seconds", "stop": "2026-03-20 09:50:30.388389", "user_input": ""} TASK [ceph_migrate : MON - Redeploy mon on np0005652760.localdomain] *********** skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : MON - Redeploy mon on np0005652760.localdomain] *********** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "daemon", "add", "mon", "np0005652760.localdomain:172.18.0.104"], "delta": "0:00:03.700928", "end": "2026-03-20 09:50:34.718654", "msg": "", "rc": 0, "start": "2026-03-20 09:50:31.017726", "stderr": "", "stderr_lines": [], "stdout": "Deployed mon.np0005652760 on host 'np0005652760.localdomain'", "stdout_lines": ["Deployed mon.np0005652760 on host 'np0005652760.localdomain'"]} TASK [ceph_migrate : MON - Wait for the spec to be updated] ******************** Pausing for 10 seconds ok: [np0005652755.localdomain] => {"changed": false, 
"delta": 10, "echo": true, "rc": 0, "start": "2026-03-20 09:50:34.851631", "stderr": "", "stdout": "Paused for 10.01 seconds", "stop": "2026-03-20 09:50:44.864735", "user_input": ""} TASK [ceph_migrate : MON - Check mons quorum] ********************************** changed: [np0005652755.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "-s", "-f", "json"], "delta": "0:00:00.787704", "end": "2026-03-20 09:50:46.196429", "msg": "", "rc": 0, "start": "2026-03-20 09:50:45.408725", "stderr": "", "stderr_lines": [], "stdout": "\n{\"fsid\":\"39ff5591-b969-58ac-89fa-bf85e4fa1d90\",\"health\":{\"status\":\"HEALTH_WARN\",\"checks\":{\"CEPHADM_STRAY_DAEMON\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"1 stray daemon(s) not managed by cephadm\",\"count\":1},\"muted\":false},\"CEPHADM_STRAY_HOST\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"1 stray host(s) with 1 daemon(s) not managed by 
cephadm\",\"count\":1},\"muted\":false}},\"mutes\":[]},\"election_epoch\":62,\"quorum\":[0,1,2,3],\"quorum_names\":[\"np0005652757\",\"np0005652761\",\"np0005652759\",\"np0005652760\"],\"quorum_age\":3,\"monmap\":{\"epoch\":14,\"min_mon_release_name\":\"reef\",\"num_mons\":4},\"osdmap\":{\"epoch\":87,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1773992691,\"num_in_osds\":6,\"osd_in_since\":1773992670,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109573208,\"bytes_used\":630112256,\"bytes_avail\":44441878528,\"bytes_total\":45071990784},\"fsmap\":{\"epoch\":16,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005652760.wqlirt\",\"status\":\"up:active\",\"gid\":26492}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":4,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":81,\"modified\":\"2026-03-20T09:50:24.066551+0000\",\"services\":{}},\"progress_events\":{}}", "stdout_lines": ["", "{\"fsid\":\"39ff5591-b969-58ac-89fa-bf85e4fa1d90\",\"health\":{\"status\":\"HEALTH_WARN\",\"checks\":{\"CEPHADM_STRAY_DAEMON\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"1 stray daemon(s) not managed by cephadm\",\"count\":1},\"muted\":false},\"CEPHADM_STRAY_HOST\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"1 stray host(s) with 1 daemon(s) not managed by 
cephadm\",\"count\":1},\"muted\":false}},\"mutes\":[]},\"election_epoch\":62,\"quorum\":[0,1,2,3],\"quorum_names\":[\"np0005652757\",\"np0005652761\",\"np0005652759\",\"np0005652760\"],\"quorum_age\":3,\"monmap\":{\"epoch\":14,\"min_mon_release_name\":\"reef\",\"num_mons\":4},\"osdmap\":{\"epoch\":87,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1773992691,\"num_in_osds\":6,\"osd_in_since\":1773992670,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109573208,\"bytes_used\":630112256,\"bytes_avail\":44441878528,\"bytes_total\":45071990784},\"fsmap\":{\"epoch\":16,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005652760.wqlirt\",\"status\":\"up:active\",\"gid\":26492}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":4,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":81,\"modified\":\"2026-03-20T09:50:24.066551+0000\",\"services\":{}},\"progress_events\":{}}"]} TASK [ceph_migrate : reconfig osds] ******************************************** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "reconfig", "osd.default_drive_group"], "delta": "0:00:00.912521", "end": "2026-03-20 09:50:47.795594", "msg": "", "rc": 0, "start": "2026-03-20 09:50:46.883073", "stderr": "", "stderr_lines": [], "stdout": "Scheduled to reconfig osd.2 on host 'np0005652759.localdomain'\nScheduled to reconfig osd.5 on host 'np0005652759.localdomain'\nScheduled to reconfig osd.1 on host 'np0005652760.localdomain'\nScheduled to 
reconfig osd.4 on host 'np0005652760.localdomain'\nScheduled to reconfig osd.0 on host 'np0005652761.localdomain'\nScheduled to reconfig osd.3 on host 'np0005652761.localdomain'", "stdout_lines": ["Scheduled to reconfig osd.2 on host 'np0005652759.localdomain'", "Scheduled to reconfig osd.5 on host 'np0005652759.localdomain'", "Scheduled to reconfig osd.1 on host 'np0005652760.localdomain'", "Scheduled to reconfig osd.4 on host 'np0005652760.localdomain'", "Scheduled to reconfig osd.0 on host 'np0005652761.localdomain'", "Scheduled to reconfig osd.3 on host 'np0005652761.localdomain'"]} TASK [ceph_migrate : force-fail ceph mgr] ************************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/fail_mgr.yaml for np0005652755.localdomain TASK [ceph_migrate : Refresh ceph_cli] ***************************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_cli.yaml for np0005652755.localdomain TASK [ceph_migrate : Set ceph CLI] ********************************************* ok: [np0005652755.localdomain] => {"ansible_facts": {"ceph_cli": "podman run --rm --net=host --ipc=host --volume /home/tripleo-admin/ceph_client:/etc/ceph:z --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90 -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring "}, "changed": false} TASK [ceph_migrate : Force fail ceph mgr] ************************************** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "mgr", "fail"], "delta": "0:00:00.783994", 
"end": "2026-03-20 09:50:49.312177", "msg": "", "rc": 0, "start": "2026-03-20 09:50:48.528183", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : Wait for cephadm to reconcile] **************************** Pausing for 10 seconds ok: [np0005652755.localdomain] => {"changed": false, "delta": 10, "echo": true, "rc": 0, "start": "2026-03-20 09:50:49.463539", "stderr": "", "stdout": "Paused for 10.01 seconds", "stop": "2026-03-20 09:50:59.478408", "user_input": ""} TASK [ceph_migrate : Get the ceph orchestrator status with] ******************** ASYNC OK on np0005652755.localdomain: jid=j32725019604.490362 changed: [np0005652755.localdomain] => {"ansible_job_id": "j32725019604.490362", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "status", "--format", "json"], "delta": "0:00:00.723315", "end": "2026-03-20 09:51:00.918619", "failed_when_result": false, "finished": 1, "msg": "", "rc": 0, "results_file": "/root/.ansible_async/j32725019604.490362", "start": "2026-03-20 09:51:00.195304", "started": 1, "stderr": "", "stderr_lines": [], "stdout": "\n{\"available\": true, \"backend\": \"cephadm\", \"paused\": false, \"workers\": 10}", "stdout_lines": ["", "{\"available\": true, \"backend\": \"cephadm\", \"paused\": false, \"workers\": 10}"]} TASK [ceph_migrate : Restart the active mgr] *********************************** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "\"Timeout exceeded\" in ceph_orch_status.msg", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Fail if ceph orchestrator is still not responding] ******** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": 
"\"Timeout exceeded\" in ceph_orch_status.msg", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - Manage mons] **************************************** ok: [np0005652755.localdomain] => {"changed": false, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "-v", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "-v", "/var/lib/ceph/:/var/lib/ceph/:z", "-v", "/var/log/ceph/:/var/log/ceph/:z", "-v", "/home/tripleo-admin/mon:/home/tripleo-admin/mon:z", "--entrypoint=ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "-n", "client.admin", "-k", "/etc/ceph/ceph.client.admin.keyring", "--cluster", "ceph", "orch", "apply", "--in-file", "/home/tripleo-admin/mon"], "delta": "0:00:00.844215", "end": "2026-03-20 09:51:03.509531", "rc": 0, "start": "2026-03-20 09:51:02.665316", "stderr": "", "stderr_lines": [], "stdout": "Scheduled mon update...", "stdout_lines": ["Scheduled mon update..."]} TASK [ceph_migrate : MON - wait daemons] *************************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/wait_daemons.yaml for np0005652755.localdomain TASK [ceph_migrate : print daemon id option] *********************************** skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : wait for mon] ********************************************* changed: [np0005652755.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "ps", "--daemon_type", "mon", "--daemon_id", "np0005652760", "-f", "json"], "delta": "0:00:00.697387", "end": "2026-03-20 09:51:04.896415", "msg": "", "rc": 0, "start": "2026-03-20 
09:51:04.199028", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"container_id\": \"9cc205e1c429\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"2.99%\", \"created\": \"2026-03-20T09:50:34.514431Z\", \"daemon_id\": \"np0005652760\", \"daemon_name\": \"mon.np0005652760\", \"daemon_type\": \"mon\", \"hostname\": \"np0005652760.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:50:51.550729Z\", \"memory_request\": 2147483648, \"memory_usage\": 53865349, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2026-03-20T09:50:34.410290Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}]", "stdout_lines": ["", "[{\"container_id\": \"9cc205e1c429\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"2.99%\", \"created\": \"2026-03-20T09:50:34.514431Z\", \"daemon_id\": \"np0005652760\", \"daemon_name\": \"mon.np0005652760\", \"daemon_type\": \"mon\", \"hostname\": \"np0005652760.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:50:51.550729Z\", \"memory_request\": 2147483648, \"memory_usage\": 53865349, \"ports\": [], \"service_name\": \"mon\", \"started\": 
\"2026-03-20T09:50:34.410290Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}]"]} TASK [ceph_migrate : Get ceph_cli] ********************************************* included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_cli.yaml for np0005652755.localdomain TASK [ceph_migrate : Set ceph CLI] ********************************************* ok: [np0005652755.localdomain] => {"ansible_facts": {"ceph_cli": "podman run --rm --net=host --ipc=host --volume /home/tripleo-admin/ceph_client:/etc/ceph:z --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90 -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring "}, "changed": false} TASK [ceph_migrate : Migrate mon] ********************************************** ok: [np0005652755.localdomain] => { "msg": "Migrate mon: np0005652757.localdomain to node: np0005652761.localdomain" } TASK [ceph_migrate : MON - Get current mon IP address from node_map override] *** ok: [np0005652755.localdomain] => {"ansible_facts": {"mon_ipaddr": "172.18.0.105"}, "changed": false} TASK [ceph_migrate : MON - Check mons quorum] ********************************** changed: [np0005652755.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "-s", "-f", "json"], "delta": "0:00:00.697395", "end": "2026-03-20 09:51:06.448699", "msg": "", "rc": 0, "start": "2026-03-20 09:51:05.751304", "stderr": "", "stderr_lines": [], "stdout": 
"\n{\"fsid\":\"39ff5591-b969-58ac-89fa-bf85e4fa1d90\",\"health\":{\"status\":\"HEALTH_WARN\",\"checks\":{\"CEPHADM_STRAY_DAEMON\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"1 stray daemon(s) not managed by cephadm\",\"count\":1},\"muted\":false},\"CEPHADM_STRAY_HOST\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"1 stray host(s) with 1 daemon(s) not managed by cephadm\",\"count\":1},\"muted\":false}},\"mutes\":[]},\"election_epoch\":62,\"quorum\":[0,1,2,3],\"quorum_names\":[\"np0005652757\",\"np0005652761\",\"np0005652759\",\"np0005652760\"],\"quorum_age\":23,\"monmap\":{\"epoch\":14,\"min_mon_release_name\":\"reef\",\"num_mons\":4},\"osdmap\":{\"epoch\":88,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1773992691,\"num_in_osds\":6,\"osd_in_since\":1773992670,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109574332,\"bytes_used\":634372096,\"bytes_avail\":44437618688,\"bytes_total\":45071990784,\"read_bytes_sec\":19354,\"write_bytes_sec\":0,\"read_op_per_sec\":8,\"write_op_per_sec\":1},\"fsmap\":{\"epoch\":16,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005652760.wqlirt\",\"status\":\"up:active\",\"gid\":26492}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":4,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":81,\"modified\":\"2026-03-20T09:50:24.066551+0000\",\"services\":{}},\"progress_events\":{}}", "stdout_lines": ["", "{\"fsid\":\"39ff5591-b969-58ac-89fa-bf85e4fa1d90\",\"health\":{\"status\":\"HEALTH_WARN\",\"checks\":{\"CEPHADM_STRAY_DAEMON\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"1 stray daemon(s) not managed by cephadm\",\"count\":1},\"muted\":false},\"CEPHADM_STRAY_HOST\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"1 stray host(s) with 1 daemon(s) not 
managed by cephadm\",\"count\":1},\"muted\":false}},\"mutes\":[]},\"election_epoch\":62,\"quorum\":[0,1,2,3],\"quorum_names\":[\"np0005652757\",\"np0005652761\",\"np0005652759\",\"np0005652760\"],\"quorum_age\":23,\"monmap\":{\"epoch\":14,\"min_mon_release_name\":\"reef\",\"num_mons\":4},\"osdmap\":{\"epoch\":88,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1773992691,\"num_in_osds\":6,\"osd_in_since\":1773992670,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109574332,\"bytes_used\":634372096,\"bytes_avail\":44437618688,\"bytes_total\":45071990784,\"read_bytes_sec\":19354,\"write_bytes_sec\":0,\"read_op_per_sec\":8,\"write_op_per_sec\":1},\"fsmap\":{\"epoch\":16,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005652760.wqlirt\",\"status\":\"up:active\",\"gid\":26492}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":4,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":81,\"modified\":\"2026-03-20T09:50:24.066551+0000\",\"services\":{}},\"progress_events\":{}}"]} TASK [ceph_migrate : Backup data for client purposes] ************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/backup.yaml for np0005652755.localdomain TASK [ceph_migrate : Ensure backup directory exists] *************************** changed: [np0005652755.localdomain -> np0005652757.localdomain(192.168.122.105)] => {"changed": true, "gid": 1003, "group": "tripleo-admin", "mode": "0755", "owner": "tripleo-admin", "path": "/home/tripleo-admin/ceph_client", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 6, "state": "directory", "uid": 1003} TASK [ceph_migrate : Check file in the src directory] ************************** ok: [np0005652755.localdomain -> 
np0005652757.localdomain(192.168.122.105)] => {"changed": false, "examined": 2, "files": [{"atime": 1774000253.39602, "ctime": 1774000253.8470337, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 1275162576, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1774000253.594026, "nlink": 1, "path": "/etc/ceph/ceph.conf", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 319, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, {"atime": 1774000254.7360606, "ctime": 1774000255.179074, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 1275069798, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0600", "mtime": 1774000254.912066, "nlink": 1, "path": "/etc/ceph/ceph.client.admin.keyring", "pw_name": "root", "rgrp": false, "roth": false, "rusr": true, "size": 151, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}], "matched": 2, "msg": "All paths examined", "skipped_paths": {}} TASK [ceph_migrate : Backup ceph client data] ********************************** changed: [np0005652755.localdomain -> np0005652757.localdomain(192.168.122.105)] => (item={'path': '/etc/ceph/ceph.conf', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 319, 'inode': 1275162576, 'dev': 64516, 'nlink': 1, 'atime': 1774000253.39602, 'mtime': 1774000253.594026, 'ctime': 1774000253.8470337, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) => {"ansible_loop_var": "item", "changed": true, "checksum": 
"916fff1ce145890b8b78cdace3b878fb03514de6", "dest": "/home/tripleo-admin/ceph_client/ceph.conf", "gid": 0, "group": "root", "item": {"atime": 1774000253.39602, "ctime": 1774000253.8470337, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 1275162576, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1774000253.594026, "nlink": 1, "path": "/etc/ceph/ceph.conf", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 319, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, "md5sum": "da32bcc7a163f9420b91e15be1742490", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 319, "src": "/etc/ceph/ceph.conf", "state": "file", "uid": 0} changed: [np0005652755.localdomain -> np0005652757.localdomain(192.168.122.105)] => (item={'path': '/etc/ceph/ceph.client.admin.keyring', 'mode': '0600', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 151, 'inode': 1275069798, 'dev': 64516, 'nlink': 1, 'atime': 1774000254.7360606, 'mtime': 1774000254.912066, 'ctime': 1774000255.179074, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': False, 'xgrp': False, 'woth': False, 'roth': False, 'xoth': False, 'isuid': False, 'isgid': False}) => {"ansible_loop_var": "item", "changed": true, "checksum": "fd9bfb4214b0e14a9ab3b9114f5c6030f9b187ed", "dest": "/home/tripleo-admin/ceph_client/ceph.client.admin.keyring", "gid": 0, "group": "root", "item": {"atime": 1774000254.7360606, "ctime": 1774000255.179074, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 1275069798, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0600", "mtime": 
1774000254.912066, "nlink": 1, "path": "/etc/ceph/ceph.client.admin.keyring", "pw_name": "root", "rgrp": false, "roth": false, "rusr": true, "size": 151, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, "md5sum": "00437778ea8eb37cb72a6b283cb52066", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 151, "src": "/etc/ceph/ceph.client.admin.keyring", "state": "file", "uid": 0} TASK [ceph_migrate : MON - Get the current active mgr] ************************* changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "mgr", "stat", "-f", "json"], "delta": "0:00:00.780879", "end": "2026-03-20 09:51:11.169608", "msg": "", "rc": 0, "start": "2026-03-20 09:51:10.388729", "stderr": "", "stderr_lines": [], "stdout": "\n{\"epoch\":30,\"available\":true,\"active_name\":\"np0005652761.zmdusi\",\"num_standby\":4}", "stdout_lines": ["", "{\"epoch\":30,\"available\":true,\"active_name\":\"np0005652761.zmdusi\",\"num_standby\":4}"]} TASK [ceph_migrate : MON - Load mgr data] ************************************** ok: [np0005652755.localdomain] => {"ansible_facts": {"mgr": {"active_name": "np0005652761.zmdusi", "available": true, "epoch": 30, "num_standby": 4}}, "changed": false} TASK [ceph_migrate : Print active mgr] ***************************************** skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Fail mgr if active in the current node] ******************* included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/fail_mgr.yaml for np0005652755.localdomain TASK 
[ceph_migrate : Refresh ceph_cli] ***************************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_cli.yaml for np0005652755.localdomain TASK [ceph_migrate : Set ceph CLI] ********************************************* ok: [np0005652755.localdomain] => {"ansible_facts": {"ceph_cli": "podman run --rm --net=host --ipc=host --volume /home/tripleo-admin/ceph_client:/etc/ceph:z --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90 -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring "}, "changed": false} TASK [ceph_migrate : Force fail ceph mgr] ************************************** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "mgr", "fail"], "delta": "0:00:00.820092", "end": "2026-03-20 09:51:12.889059", "msg": "", "rc": 0, "start": "2026-03-20 09:51:12.068967", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : Wait for cephadm to reconcile] **************************** Pausing for 10 seconds ok: [np0005652755.localdomain] => {"changed": false, "delta": 10, "echo": true, "rc": 0, "start": "2026-03-20 09:51:13.002943", "stderr": "", "stdout": "Paused for 10.01 seconds", "stop": "2026-03-20 09:51:23.015384", "user_input": ""} TASK [ceph_migrate : Get the ceph orchestrator status with] ******************** ASYNC POLL on np0005652755.localdomain: jid=j705952690007.491610 started=1 finished=0 ASYNC POLL on np0005652755.localdomain: jid=j705952690007.491610 started=1 finished=0 ASYNC POLL on np0005652755.localdomain: jid=j705952690007.491610 started=1 
finished=0 ASYNC OK on np0005652755.localdomain: jid=j705952690007.491610 changed: [np0005652755.localdomain] => {"ansible_job_id": "j705952690007.491610", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "status", "--format", "json"], "delta": "0:00:23.957867", "end": "2026-03-20 09:51:47.769707", "failed_when_result": false, "finished": 1, "msg": "", "rc": 0, "results_file": "/root/.ansible_async/j705952690007.491610", "start": "2026-03-20 09:51:23.811840", "started": 1, "stderr": "", "stderr_lines": [], "stdout": "\n{\"available\": true, \"backend\": \"cephadm\", \"paused\": 
false, \"workers\": 10}", "stdout_lines": ["", "{\"available\": true, \"backend\": \"cephadm\", \"paused\": false, \"workers\": 10}"]} TASK [ceph_migrate : Restart the active mgr] *********************************** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "\"Timeout exceeded\" in ceph_orch_status.msg", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Fail if ceph orchestrator is still not responding] ******** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "\"Timeout exceeded\" in ceph_orch_status.msg", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - Drain and rm --force the cur_mon host] ************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/drain.yaml for np0005652755.localdomain TASK [ceph_migrate : Get ceph_cli] ********************************************* included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_cli.yaml for np0005652755.localdomain TASK [ceph_migrate : Set ceph CLI] ********************************************* ok: [np0005652755.localdomain] => {"ansible_facts": {"ceph_cli": "podman run --rm --net=host --ipc=host --volume /home/tripleo-admin/ceph_client:/etc/ceph:z --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90 -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring "}, "changed": false} TASK [ceph_migrate : MON - wait daemons] *************************************** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", 
"orch", "ps", "--daemon_type", "mon", "--daemon_id", "np0005652757", "-f", "json"], "delta": "0:00:00.822469", "end": "2026-03-20 09:51:50.196279", "msg": "", "rc": 0, "start": "2026-03-20 09:51:49.373810", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"container_id\": \"28fd234b0571\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"1.19%\", \"created\": \"2026-03-20T07:43:57.303694Z\", \"daemon_id\": \"np0005652757\", \"daemon_name\": \"mon.np0005652757\", \"daemon_type\": \"mon\", \"hostname\": \"np0005652757.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:51:48.639790Z\", \"memory_request\": 2147483648, \"memory_usage\": 171022745, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2026-03-20T07:43:57.186265Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}]", "stdout_lines": ["", "[{\"container_id\": \"28fd234b0571\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"1.19%\", \"created\": \"2026-03-20T07:43:57.303694Z\", \"daemon_id\": \"np0005652757\", \"daemon_name\": \"mon.np0005652757\", \"daemon_type\": \"mon\", \"hostname\": \"np0005652757.localdomain\", \"is_active\": false, 
\"last_refresh\": \"2026-03-20T09:51:48.639790Z\", \"memory_request\": 2147483648, \"memory_usage\": 171022745, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2026-03-20T07:43:57.186265Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}]"]} TASK [ceph_migrate : DRAIN - Delete the mon running on the current controller node] *** changed: [np0005652755.localdomain -> np0005652757.localdomain(192.168.122.105)] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "daemon", "rm", "mon.np0005652757", "--force"], "delta": "0:00:02.312863", "end": "2026-03-20 09:51:53.336124", "msg": "", "rc": 0, "start": "2026-03-20 09:51:51.023261", "stderr": "", "stderr_lines": [], "stdout": "Removed mon.np0005652757 from host 'np0005652757.localdomain'", "stdout_lines": ["Removed mon.np0005652757 from host 'np0005652757.localdomain'"]} TASK [ceph_migrate : DRAIN - remove label from the src node] ******************* included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/labels.yaml for np0005652755.localdomain TASK [ceph_migrate : Set/Unset labels - rm] ************************************ skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Print nodes] ********************************************** skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Set/Unset labels - rm] ************************************ changed: [np0005652755.localdomain] => (item=['np0005652757.localdomain', 'mon']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", 
"--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "rm", "np0005652757.localdomain", "mon"], "delta": "0:00:04.020379", "end": "2026-03-20 09:51:58.176592", "item": ["np0005652757.localdomain", "mon"], "msg": "", "rc": 0, "start": "2026-03-20 09:51:54.156213", "stderr": "", "stderr_lines": [], "stdout": "Removed label mon from host np0005652757.localdomain", "stdout_lines": ["Removed label mon from host np0005652757.localdomain"]} changed: [np0005652755.localdomain] => (item=['np0005652757.localdomain', 'mgr']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "rm", "np0005652757.localdomain", "mgr"], "delta": "0:00:00.721486", "end": "2026-03-20 09:51:59.463439", "item": ["np0005652757.localdomain", "mgr"], "msg": "", "rc": 0, "start": "2026-03-20 09:51:58.741953", "stderr": "", "stderr_lines": [], "stdout": "Removed label mgr from host np0005652757.localdomain", "stdout_lines": ["Removed label mgr from host np0005652757.localdomain"]} changed: [np0005652755.localdomain] => (item=['np0005652757.localdomain', '_admin']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", 
"orch", "host", "label", "rm", "np0005652757.localdomain", "_admin"], "delta": "0:00:00.676027", "end": "2026-03-20 09:52:00.799650", "item": ["np0005652757.localdomain", "_admin"], "msg": "", "rc": 0, "start": "2026-03-20 09:52:00.123623", "stderr": "", "stderr_lines": [], "stdout": "Removed label _admin from host np0005652757.localdomain", "stdout_lines": ["Removed label _admin from host np0005652757.localdomain"]} TASK [ceph_migrate : Wait for the orchestrator to remove labels] *************** Pausing for 10 seconds ok: [np0005652755.localdomain] => {"changed": false, "delta": 10, "echo": true, "rc": 0, "start": "2026-03-20 09:52:00.948342", "stderr": "", "stdout": "Paused for 10.01 seconds", "stop": "2026-03-20 09:52:10.962072", "user_input": ""} TASK [ceph_migrate : DRAIN - Drain the host] *********************************** changed: [np0005652755.localdomain -> np0005652757.localdomain(192.168.122.105)] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "drain", "np0005652757.localdomain"], "delta": "0:00:00.731593", "end": "2026-03-20 09:52:12.430805", "msg": "", "rc": 0, "start": "2026-03-20 09:52:11.699212", "stderr": "", "stderr_lines": [], "stdout": "Scheduled to remove the following daemons from host 'np0005652757.localdomain'\ntype id \n-------------------- ---------------\nmgr np0005652757.puuyvp\ncrash np0005652757 ", "stdout_lines": ["Scheduled to remove the following daemons from host 'np0005652757.localdomain'", "type id ", "-------------------- ---------------", "mgr np0005652757.puuyvp", "crash np0005652757 "]} TASK [ceph_migrate : DRAIN - cleanup the host] ********************************* skipping: [np0005652755.localdomain] => {"changed": 
false, "false_condition": "force_clean | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - check host in hostmap] ****************************** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "ls", "--host_pattern", "np0005652757.localdomain", "-f", "json"], "delta": "0:00:00.732430", "end": "2026-03-20 09:52:13.850268", "msg": "", "rc": 0, "start": "2026-03-20 09:52:13.117838", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"addr\": \"192.168.122.105\", \"hostname\": \"np0005652757.localdomain\", \"labels\": [\"_no_schedule\", \"_no_conf_keyring\"], \"status\": \"\"}]", "stdout_lines": ["", "[{\"addr\": \"192.168.122.105\", \"hostname\": \"np0005652757.localdomain\", \"labels\": [\"_no_schedule\", \"_no_conf_keyring\"], \"status\": \"\"}]"]} TASK [ceph_migrate : MON - rm the cur_mon host from the Ceph cluster] ********** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "rm", "np0005652757.localdomain", "--force"], "delta": "0:00:00.715573", "end": "2026-03-20 09:52:15.180578", "msg": "", "rc": 0, "start": "2026-03-20 09:52:14.465005", "stderr": "", "stderr_lines": [], "stdout": "Removed host 'np0005652757.localdomain'", "stdout_lines": ["Removed host 'np0005652757.localdomain'"]} TASK [ceph_migrate : MON - Get current mon IP address] 
************************* skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "mon_ipaddr is not defined", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - Migrate the mon ip address from src to target node] *** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/network.yaml for np0005652755.localdomain TASK [ceph_migrate : MON - Print current mon IP address] *********************** skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : MON - Patch net-config config and remove the IP address (src node)] *** changed: [np0005652755.localdomain -> np0005652757.localdomain(192.168.122.105)] => {"backup": "/etc/os-net-config/tripleo_config.yaml.472139.2026-03-20@09:52:16~", "changed": true, "found": 1, "msg": "1 line(s) removed"} TASK [ceph_migrate : MON - Refresh os-net-config (src node)] ******************* skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "not manual_migration | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - manually rm the ip address (src node)] ************** changed: [np0005652755.localdomain -> np0005652757.localdomain(192.168.122.105)] => {"changed": true, "cmd": ["ip", "a", "del", "172.18.0.105/24", "dev", "vlan21"], "delta": "0:00:00.006728", "end": "2026-03-20 09:52:16.963019", "msg": "", "rc": 0, "start": "2026-03-20 09:52:16.956291", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : MON - Patch net-config config and add the IP address (target node)] *** changed: [np0005652755.localdomain -> np0005652761.localdomain(192.168.122.108)] => {"backup": "/etc/os-net-config/tripleo_config.yaml.295546.2026-03-20@09:52:18~", "changed": true, "msg": "line added"} TASK [ceph_migrate : MON - Refresh os-net-config (target_node)] **************** skipping: [np0005652755.localdomain] => 
{"changed": false, "false_condition": "not manual_migration | bool | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - statically assign the ip address to the target node] *** changed: [np0005652755.localdomain -> np0005652761.localdomain(192.168.122.108)] => {"changed": true, "cmd": ["ip", "a", "add", "172.18.0.105/24", "dev", "vlan21"], "delta": "0:00:00.003884", "end": "2026-03-20 09:52:19.110374", "msg": "", "rc": 0, "start": "2026-03-20 09:52:19.106490", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : MON - ping ip address to see if is reachable on the target node] *** changed: [np0005652755.localdomain -> np0005652761.localdomain(192.168.122.108)] => {"changed": true, "cmd": ["ping", "-W1", "-c", "3", "172.18.0.105"], "delta": "0:00:02.034235", "end": "2026-03-20 09:52:22.872798", "msg": "", "rc": 0, "start": "2026-03-20 09:52:20.838563", "stderr": "", "stderr_lines": [], "stdout": "PING 172.18.0.105 (172.18.0.105) 56(84) bytes of data.\n64 bytes from 172.18.0.105: icmp_seq=1 ttl=64 time=0.071 ms\n64 bytes from 172.18.0.105: icmp_seq=2 ttl=64 time=0.058 ms\n64 bytes from 172.18.0.105: icmp_seq=3 ttl=64 time=0.105 ms\n\n--- 172.18.0.105 ping statistics ---\n3 packets transmitted, 3 received, 0% packet loss, time 2027ms\nrtt min/avg/max/mdev = 0.058/0.078/0.105/0.019 ms", "stdout_lines": ["PING 172.18.0.105 (172.18.0.105) 56(84) bytes of data.", "64 bytes from 172.18.0.105: icmp_seq=1 ttl=64 time=0.071 ms", "64 bytes from 172.18.0.105: icmp_seq=2 ttl=64 time=0.058 ms", "64 bytes from 172.18.0.105: icmp_seq=3 ttl=64 time=0.105 ms", "", "--- 172.18.0.105 ping statistics ---", "3 packets transmitted, 3 received, 0% packet loss, time 2027ms", "rtt min/avg/max/mdev = 0.058/0.078/0.105/0.019 ms"]} TASK [ceph_migrate : MON - Fail if the IP address is not active in the target node] *** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ping_target_ip.rc != 0", 
"skip_reason": "Conditional result was False"} TASK [ceph_migrate : Unmanage mons] ******************************************** ok: [np0005652755.localdomain] => {"changed": false, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "-v", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "-v", "/var/lib/ceph/:/var/lib/ceph/:z", "-v", "/var/log/ceph/:/var/log/ceph/:z", "-v", "/home/tripleo-admin/mon:/home/tripleo-admin/mon:z", "--entrypoint=ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "-n", "client.admin", "-k", "/etc/ceph/ceph.client.admin.keyring", "--cluster", "ceph", "orch", "apply", "--in-file", "/home/tripleo-admin/mon"], "delta": "0:00:00.722900", "end": "2026-03-20 09:52:24.277255", "rc": 0, "start": "2026-03-20 09:52:23.554355", "stderr": "", "stderr_lines": [], "stdout": "Scheduled mon update...", "stdout_lines": ["Scheduled mon update..."]} TASK [ceph_migrate : Print the resulting spec] ********************************* skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : MON - Get tmp mon] **************************************** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "ps", "--daemon_type", "mon", "--daemon_id", "np0005652761", "-f", "json"], "delta": "0:00:00.682270", "end": "2026-03-20 09:52:25.630566", "msg": "", "rc": 0, "start": "2026-03-20 09:52:24.948296", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"container_id\": \"81878b4169a8\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\", 
\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"3.22%\", \"created\": \"2026-03-20T09:47:28.092398Z\", \"daemon_id\": \"np0005652761\", \"daemon_name\": \"mon.np0005652761\", \"daemon_type\": \"mon\", \"events\": [\"2026-03-20T09:52:17.828508Z daemon:mon.np0005652761 [INFO] \\\"Reconfigured mon.np0005652761 on host 'np0005652761.localdomain'\\\"\"], \"hostname\": \"np0005652761.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:51:48.917766Z\", \"memory_request\": 2147483648, \"memory_usage\": 69226987, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2026-03-20T09:47:27.985362Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}]", "stdout_lines": ["", "[{\"container_id\": \"81878b4169a8\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"3.22%\", \"created\": \"2026-03-20T09:47:28.092398Z\", \"daemon_id\": \"np0005652761\", \"daemon_name\": \"mon.np0005652761\", \"daemon_type\": \"mon\", \"events\": [\"2026-03-20T09:52:17.828508Z daemon:mon.np0005652761 [INFO] \\\"Reconfigured mon.np0005652761 on host 'np0005652761.localdomain'\\\"\"], \"hostname\": \"np0005652761.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:51:48.917766Z\", \"memory_request\": 2147483648, \"memory_usage\": 69226987, \"ports\": [], 
\"service_name\": \"mon\", \"started\": \"2026-03-20T09:47:27.985362Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}]"]} TASK [ceph_migrate : MON - Delete the running mon] ***************************** changed: [np0005652755.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "daemon", "rm", "mon.np0005652761", "--force"], "delta": "0:00:02.477213", "end": "2026-03-20 09:52:28.712299", "msg": "", "rc": 0, "start": "2026-03-20 09:52:26.235086", "stderr": "", "stderr_lines": [], "stdout": "Removed mon.np0005652761 from host 'np0005652761.localdomain'", "stdout_lines": ["Removed mon.np0005652761 from host 'np0005652761.localdomain'"]} TASK [ceph_migrate : MON - Wait for the current mon to be deleted] ************* Pausing for 10 seconds ok: [np0005652755.localdomain] => {"changed": false, "delta": 10, "echo": true, "rc": 0, "start": "2026-03-20 09:52:28.831595", "stderr": "", "stdout": "Paused for 10.01 seconds", "stop": "2026-03-20 09:52:38.837408", "user_input": ""} TASK [ceph_migrate : MON - Redeploy mon on np0005652761.localdomain] *********** skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : MON - Redeploy mon on np0005652761.localdomain] *********** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "daemon", "add", "mon", 
"np0005652761.localdomain:172.18.0.105"], "delta": "0:00:03.606143", "end": "2026-03-20 09:52:43.031815", "msg": "", "rc": 0, "start": "2026-03-20 09:52:39.425672", "stderr": "", "stderr_lines": [], "stdout": "Deployed mon.np0005652761 on host 'np0005652761.localdomain'", "stdout_lines": ["Deployed mon.np0005652761 on host 'np0005652761.localdomain'"]} TASK [ceph_migrate : MON - Wait for the spec to be updated] ******************** Pausing for 10 seconds ok: [np0005652755.localdomain] => {"changed": false, "delta": 10, "echo": true, "rc": 0, "start": "2026-03-20 09:52:43.165384", "stderr": "", "stdout": "Paused for 10.01 seconds", "stop": "2026-03-20 09:52:53.179087", "user_input": ""} TASK [ceph_migrate : MON - Check mons quorum] ********************************** changed: [np0005652755.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "-s", "-f", "json"], "delta": "0:00:00.786599", "end": "2026-03-20 09:52:54.464020", "msg": "", "rc": 0, "start": "2026-03-20 09:52:53.677421", "stderr": "", "stderr_lines": [], "stdout": 
"\n{\"fsid\":\"39ff5591-b969-58ac-89fa-bf85e4fa1d90\",\"health\":{\"status\":\"HEALTH_OK\",\"checks\":{},\"mutes\":[]},\"election_epoch\":74,\"quorum\":[0,1,2],\"quorum_names\":[\"np0005652759\",\"np0005652760\",\"np0005652761\"],\"quorum_age\":5,\"monmap\":{\"epoch\":17,\"min_mon_release_name\":\"reef\",\"num_mons\":3},\"osdmap\":{\"epoch\":90,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1773992691,\"num_in_osds\":6,\"osd_in_since\":1773992670,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109575456,\"bytes_used\":634535936,\"bytes_avail\":44437454848,\"bytes_total\":45071990784},\"fsmap\":{\"epoch\":16,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005652760.wqlirt\",\"status\":\"up:active\",\"gid\":26492}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":3,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":84,\"modified\":\"2026-03-20T09:52:48.565851+0000\",\"services\":{}},\"progress_events\":{}}", "stdout_lines": ["", 
"{\"fsid\":\"39ff5591-b969-58ac-89fa-bf85e4fa1d90\",\"health\":{\"status\":\"HEALTH_OK\",\"checks\":{},\"mutes\":[]},\"election_epoch\":74,\"quorum\":[0,1,2],\"quorum_names\":[\"np0005652759\",\"np0005652760\",\"np0005652761\"],\"quorum_age\":5,\"monmap\":{\"epoch\":17,\"min_mon_release_name\":\"reef\",\"num_mons\":3},\"osdmap\":{\"epoch\":90,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1773992691,\"num_in_osds\":6,\"osd_in_since\":1773992670,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109575456,\"bytes_used\":634535936,\"bytes_avail\":44437454848,\"bytes_total\":45071990784},\"fsmap\":{\"epoch\":16,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005652760.wqlirt\",\"status\":\"up:active\",\"gid\":26492}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":3,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":84,\"modified\":\"2026-03-20T09:52:48.565851+0000\",\"services\":{}},\"progress_events\":{}}"]} TASK [ceph_migrate : reconfig osds] ******************************************** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "reconfig", "osd.default_drive_group"], "delta": "0:00:03.932765", "end": "2026-03-20 09:52:59.106130", "msg": "", "rc": 0, "start": "2026-03-20 09:52:55.173365", "stderr": "", "stderr_lines": [], "stdout": "Scheduled to reconfig osd.2 on host 'np0005652759.localdomain'\nScheduled to reconfig osd.5 on host 'np0005652759.localdomain'\nScheduled to reconfig osd.1 on host 
'np0005652760.localdomain'\nScheduled to reconfig osd.4 on host 'np0005652760.localdomain'\nScheduled to reconfig osd.0 on host 'np0005652761.localdomain'\nScheduled to reconfig osd.3 on host 'np0005652761.localdomain'", "stdout_lines": ["Scheduled to reconfig osd.2 on host 'np0005652759.localdomain'", "Scheduled to reconfig osd.5 on host 'np0005652759.localdomain'", "Scheduled to reconfig osd.1 on host 'np0005652760.localdomain'", "Scheduled to reconfig osd.4 on host 'np0005652760.localdomain'", "Scheduled to reconfig osd.0 on host 'np0005652761.localdomain'", "Scheduled to reconfig osd.3 on host 'np0005652761.localdomain'"]} TASK [ceph_migrate : force-fail ceph mgr] ************************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/fail_mgr.yaml for np0005652755.localdomain TASK [ceph_migrate : Refresh ceph_cli] ***************************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_cli.yaml for np0005652755.localdomain TASK [ceph_migrate : Set ceph CLI] ********************************************* ok: [np0005652755.localdomain] => {"ansible_facts": {"ceph_cli": "podman run --rm --net=host --ipc=host --volume /home/tripleo-admin/ceph_client:/etc/ceph:z --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90 -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring "}, "changed": false} TASK [ceph_migrate : Force fail ceph mgr] ************************************** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", 
"mgr", "fail"], "delta": "0:00:00.692246", "end": "2026-03-20 09:53:00.568984", "msg": "", "rc": 0, "start": "2026-03-20 09:52:59.876738", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : Wait for cephadm to reconcile] **************************** Pausing for 10 seconds ok: [np0005652755.localdomain] => {"changed": false, "delta": 10, "echo": true, "rc": 0, "start": "2026-03-20 09:53:00.671086", "stderr": "", "stdout": "Paused for 10.01 seconds", "stop": "2026-03-20 09:53:10.683209", "user_input": ""} TASK [ceph_migrate : Get the ceph orchestrator status with] ******************** ASYNC POLL on np0005652755.localdomain: jid=j682466082414.495523 started=1 finished=0
ASYNC OK on np0005652755.localdomain: jid=j682466082414.495523 changed: [np0005652755.localdomain] => {"ansible_job_id": "j682466082414.495523", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "status", "--format", "json"], "delta": "0:00:25.297233", "end": "2026-03-20 09:53:36.736298", "failed_when_result": false, "finished": 1, "msg": "", "rc": 0, "results_file": "/root/.ansible_async/j682466082414.495523", "start": "2026-03-20 09:53:11.439065", "started": 1, "stderr": "", "stderr_lines": [], "stdout": "\n{\"available\": true, \"backend\": \"cephadm\", \"paused\": false, \"workers\": 10}", "stdout_lines": ["", "{\"available\": true, \"backend\": \"cephadm\", \"paused\": false, \"workers\": 10}"]} TASK [ceph_migrate : Restart the active mgr] *********************************** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "\"Timeout exceeded\" in ceph_orch_status.msg", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Fail if ceph orchestrator is still not responding] ******** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "\"Timeout exceeded\" in ceph_orch_status.msg", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - Manage mons] **************************************** ok: [np0005652755.localdomain] => {"changed": false, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "-v", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "-v", "/var/lib/ceph/:/var/lib/ceph/:z", "-v", "/var/log/ceph/:/var/log/ceph/:z", "-v", "/home/tripleo-admin/mon:/home/tripleo-admin/mon:z",
"--entrypoint=ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "-n", "client.admin", "-k", "/etc/ceph/ceph.client.admin.keyring", "--cluster", "ceph", "orch", "apply", "--in-file", "/home/tripleo-admin/mon"], "delta": "0:00:00.681862", "end": "2026-03-20 09:53:39.407636", "rc": 0, "start": "2026-03-20 09:53:38.725774", "stderr": "", "stderr_lines": [], "stdout": "Scheduled mon update...", "stdout_lines": ["Scheduled mon update..."]} TASK [ceph_migrate : MON - wait daemons] *************************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/wait_daemons.yaml for np0005652755.localdomain TASK [ceph_migrate : print daemon id option] *********************************** skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : wait for mon] ********************************************* changed: [np0005652755.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "ps", "--daemon_type", "mon", "--daemon_id", "np0005652761", "-f", "json"], "delta": "0:00:00.742589", "end": "2026-03-20 09:53:40.835228", "msg": "", "rc": 0, "start": "2026-03-20 09:53:40.092639", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"container_id\": \"27e7d4dc771c\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", 
\"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"1.72%\", \"created\": \"2026-03-20T09:52:42.787629Z\", \"daemon_id\": \"np0005652761\", \"daemon_name\": \"mon.np0005652761\", \"daemon_type\": \"mon\", \"hostname\": \"np0005652761.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:53:37.953755Z\", \"memory_request\": 2147483648, \"memory_usage\": 42834329, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2026-03-20T09:52:42.681465Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}]", "stdout_lines": ["", "[{\"container_id\": \"27e7d4dc771c\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:485411749726179fe5cd880e2cf308261b35150e4b356ddb7100f52e02b2e353\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\"], \"container_image_id\": \"957884a578836762598b9ee0d341c2f25fc7f75b4c9e8a2eedc704f3f8efac7d\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"1.72%\", \"created\": \"2026-03-20T09:52:42.787629Z\", \"daemon_id\": \"np0005652761\", \"daemon_name\": \"mon.np0005652761\", \"daemon_type\": \"mon\", \"hostname\": \"np0005652761.localdomain\", \"is_active\": false, \"last_refresh\": \"2026-03-20T09:53:37.953755Z\", \"memory_request\": 2147483648, \"memory_usage\": 42834329, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2026-03-20T09:52:42.681465Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-381.el9cp\"}]"]} TASK [ceph_migrate : Sleep before moving to the next mon] ********************** Pausing for 30 seconds ok: [np0005652755.localdomain] => {"changed": false, "delta": 30, "echo": true, "rc": 0, "start": "2026-03-20 09:53:41.000614", "stderr": "", "stdout": "Paused for 30.03 seconds", "stop": "2026-03-20 09:54:11.028361", "user_input": ""} TASK [ceph_migrate : POST - Dump 
logs] ***************************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_load.yaml for np0005652755.localdomain TASK [ceph_migrate : Check file in the src directory] ************************** ok: [np0005652755.localdomain] => {"changed": false, "examined": 4, "files": [{"atime": 1774000070.7671843, "ctime": 1774000070.539178, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 360782165, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1774000017.5507681, "nlink": 1, "path": "/home/tripleo-admin/ceph_client/ceph.conf", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 142, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, {"atime": 1774000070.7731843, "ctime": 1774000070.539178, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 251855659, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1773992551.9966078, "nlink": 1, "path": "/home/tripleo-admin/ceph_client/ceph.client.admin.keyring", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 151, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, {"atime": 1773993842.578621, "ctime": 1774000070.539178, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 251855660, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1773993840.3125603, "nlink": 1, "path": "/home/tripleo-admin/ceph_client/ceph.client.openstack.keyring", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 231, "uid": 0, "wgrp": false, "woth": false, "wusr": 
true, "xgrp": false, "xoth": false, "xusr": false}, {"atime": 1773993843.740652, "ctime": 1774000070.539178, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 251855661, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1773993841.2205846, "nlink": 1, "path": "/home/tripleo-admin/ceph_client/ceph.client.manila.keyring", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 153, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}], "matched": 4, "msg": "All paths examined", "skipped_paths": {}} TASK [ceph_migrate : Restore files] ******************************************** changed: [np0005652755.localdomain] => (item=ceph.conf) => {"ansible_loop_var": "item", "changed": true, "checksum": "e4b8377a1e31f6e329c10fe03dac85aea504386f", "dest": "/etc/ceph/ceph.conf", "gid": 0, "group": "root", "item": "ceph.conf", "md5sum": "21d4fb0ad6afd5c3bbd80853e6e7caaf", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 142, "src": "/home/tripleo-admin/ceph_client/ceph.conf", "state": "file", "uid": 0} changed: [np0005652755.localdomain] => (item=ceph.client.admin.keyring) => {"ansible_loop_var": "item", "changed": true, "checksum": "fd9bfb4214b0e14a9ab3b9114f5c6030f9b187ed", "dest": "/etc/ceph/ceph.client.admin.keyring", "gid": 0, "group": "root", "item": "ceph.client.admin.keyring", "md5sum": "00437778ea8eb37cb72a6b283cb52066", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 151, "src": "/home/tripleo-admin/ceph_client/ceph.client.admin.keyring", "state": "file", "uid": 0} TASK [ceph_migrate : Ensure backup directory exists] *************************** changed: [np0005652755.localdomain] => {"changed": true, "gid": 1003, "group": "tripleo-admin", "mode": "0755", "owner": "tripleo-admin", "path": 
"/home/tripleo-admin/ceph_client/logs", "secontext": "unconfined_u:object_r:container_file_t:s0", "size": 6, "state": "directory", "uid": 1003} TASK [ceph_migrate : Get Ceph Health] ****************************************** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["cephadm", "shell", "--", "ceph", "-s", "-f", "json"], "delta": "0:00:04.574931", "end": "2026-03-20 09:54:18.604761", "msg": "", "rc": 0, "start": "2026-03-20 09:54:14.029830", "stderr": "Inferring fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90\nInferring config /etc/ceph/ceph.conf\nUsing ceph image with id '957884a57883' and tag 'latest' created on 2026-02-09 10:26:08 +0000 UTC\nregistry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3", "stderr_lines": ["Inferring fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90", "Inferring config /etc/ceph/ceph.conf", "Using ceph image with id '957884a57883' and tag 'latest' created on 2026-02-09 10:26:08 +0000 UTC", "registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3"], "stdout": 
"\n{\"fsid\":\"39ff5591-b969-58ac-89fa-bf85e4fa1d90\",\"health\":{\"status\":\"HEALTH_OK\",\"checks\":{},\"mutes\":[]},\"election_epoch\":74,\"quorum\":[0,1,2],\"quorum_names\":[\"np0005652759\",\"np0005652760\",\"np0005652761\"],\"quorum_age\":89,\"monmap\":{\"epoch\":17,\"min_mon_release_name\":\"reef\",\"num_mons\":3},\"osdmap\":{\"epoch\":92,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1773992691,\"num_in_osds\":6,\"osd_in_since\":1773992670,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109576580,\"bytes_used\":634679296,\"bytes_avail\":44437311488,\"bytes_total\":45071990784},\"fsmap\":{\"epoch\":16,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005652760.wqlirt\",\"status\":\"up:active\",\"gid\":26492}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":2,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":84,\"modified\":\"2026-03-20T09:52:48.565851+0000\",\"services\":{}},\"progress_events\":{}}", "stdout_lines": ["", 
"{\"fsid\":\"39ff5591-b969-58ac-89fa-bf85e4fa1d90\",\"health\":{\"status\":\"HEALTH_OK\",\"checks\":{},\"mutes\":[]},\"election_epoch\":74,\"quorum\":[0,1,2],\"quorum_names\":[\"np0005652759\",\"np0005652760\",\"np0005652761\"],\"quorum_age\":89,\"monmap\":{\"epoch\":17,\"min_mon_release_name\":\"reef\",\"num_mons\":3},\"osdmap\":{\"epoch\":92,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1773992691,\"num_in_osds\":6,\"osd_in_since\":1773992670,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109576580,\"bytes_used\":634679296,\"bytes_avail\":44437311488,\"bytes_total\":45071990784},\"fsmap\":{\"epoch\":16,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005652760.wqlirt\",\"status\":\"up:active\",\"gid\":26492}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":2,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":84,\"modified\":\"2026-03-20T09:52:48.565851+0000\",\"services\":{}},\"progress_events\":{}}"]} TASK [ceph_migrate : Load ceph data] ******************************************* ok: [np0005652755.localdomain] => {"ansible_facts": {"ceph": {"election_epoch": 74, "fsid": "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "fsmap": {"by_rank": [{"filesystem_id": 1, "gid": 26492, "name": "mds.np0005652760.wqlirt", "rank": 0, "status": "up:active"}], "epoch": 16, "id": 1, "in": 1, "max": 1, "up": 1, "up:standby": 2}, "health": {"checks": {}, "mutes": [], "status": "HEALTH_OK"}, "mgrmap": {"available": true, "modules": ["cephadm", "iostat", "nfs", "restful"], "num_standbys": 2, "services": {}}, "monmap": {"epoch": 17, "min_mon_release_name": "reef", "num_mons": 3}, "osdmap": {"epoch": 92, "num_in_osds": 6, "num_osds": 6, "num_remapped_pgs": 0, "num_up_osds": 6, "osd_in_since": 1773992670, "osd_up_since": 1773992691}, "pgmap": 
{"bytes_avail": 44437311488, "bytes_total": 45071990784, "bytes_used": 634679296, "data_bytes": 109576580, "num_objects": 69, "num_pgs": 177, "num_pools": 7, "pgs_by_state": [{"count": 177, "state_name": "active+clean"}]}, "progress_events": {}, "quorum": [0, 1, 2], "quorum_age": 89, "quorum_names": ["np0005652759", "np0005652760", "np0005652761"], "servicemap": {"epoch": 84, "modified": "2026-03-20T09:52:48.565851+0000", "services": {}}}}, "changed": false} TASK [ceph_migrate : Dump ceph -s output to log file] ************************** changed: [np0005652755.localdomain] => {"changed": true, "checksum": "cbd22578ac65b6efababaeaccc669ae5f6164231", "dest": "/home/tripleo-admin/ceph_client/logs/ceph_health.log", "gid": 1003, "group": "tripleo-admin", "md5sum": "6dcb20b672a8cd92c3c3d9037fdd4346", "mode": "0644", "owner": "tripleo-admin", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 1125, "src": "/home/tripleo-admin/.ansible/tmp/ansible-tmp-1774000458.7747233-63213-274400426686262/source", "state": "file", "uid": 1003} TASK [ceph_migrate : Get Ceph Orch ServiceMap] ********************************* changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["cephadm", "shell", "--", "ceph", "orch", "ls", "-f", "json"], "delta": "0:00:05.019991", "end": "2026-03-20 09:54:25.054085", "msg": "", "rc": 0, "start": "2026-03-20 09:54:20.034094", "stderr": "Inferring fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90\nInferring config /etc/ceph/ceph.conf\nUsing ceph image with id '957884a57883' and tag 'latest' created on 2026-02-09 10:26:08 +0000 UTC\nregistry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3", "stderr_lines": ["Inferring fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90", "Inferring config /etc/ceph/ceph.conf", "Using ceph image with id '957884a57883' and tag 'latest' created on 2026-02-09 10:26:08 +0000 UTC", 
"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3"], "stdout": "\n[{\"placement\": {\"host_pattern\": \"*\"}, \"service_name\": \"crash\", \"service_type\": \"crash\", \"status\": {\"created\": \"2026-03-20T07:42:24.149793Z\", \"last_refresh\": \"2026-03-20T09:53:37.517888Z\", \"running\": 3, \"size\": 3}}, {\"placement\": {\"label\": \"mds\"}, \"service_id\": \"mds\", \"service_name\": \"mds.mds\", \"service_type\": \"mds\", \"status\": {\"created\": \"2026-03-20T09:45:34.645589Z\", \"last_refresh\": \"2026-03-20T09:53:37.518101Z\", \"running\": 3, \"size\": 3}}, {\"placement\": {\"label\": \"mgr\"}, \"service_name\": \"mgr\", \"service_type\": \"mgr\", \"status\": {\"created\": \"2026-03-20T09:47:03.627105Z\", \"last_refresh\": \"2026-03-20T09:53:37.518159Z\", \"running\": 3, \"size\": 3}}, {\"events\": [\"2026-03-20T09:53:42.639367Z service:mon [INFO] \\\"service was created\\\"\"], \"placement\": {\"label\": \"mon\"}, \"service_name\": \"mon\", \"service_type\": \"mon\", \"status\": {\"created\": \"2026-03-20T09:53:39.286401Z\", \"last_refresh\": \"2026-03-20T09:53:37.518216Z\", \"running\": 3, \"size\": 3}}, {\"placement\": {\"host_pattern\": \"*\"}, \"service_name\": \"node-proxy\", \"service_type\": \"node-proxy\", \"status\": {\"created\": \"2026-03-20T07:42:38.049883Z\", \"running\": 0, \"size\": 0}}, {\"placement\": {\"hosts\": [\"np0005652759.localdomain\", \"np0005652760.localdomain\", \"np0005652761.localdomain\"]}, \"service_id\": \"default_drive_group\", \"service_name\": \"osd.default_drive_group\", \"service_type\": \"osd\", \"spec\": {\"data_devices\": {\"paths\": [\"/dev/ceph_vg0/ceph_lv0\", \"/dev/ceph_vg1/ceph_lv1\"]}, \"filter_logic\": \"AND\", \"objectstore\": \"bluestore\"}, \"status\": {\"created\": \"2026-03-20T07:43:06.026205Z\", \"last_refresh\": \"2026-03-20T09:53:37.517981Z\", \"running\": 6, \"size\": 6}}]", "stdout_lines": ["", "[{\"placement\": {\"host_pattern\": 
\"*\"}, \"service_name\": \"crash\", \"service_type\": \"crash\", \"status\": {\"created\": \"2026-03-20T07:42:24.149793Z\", \"last_refresh\": \"2026-03-20T09:53:37.517888Z\", \"running\": 3, \"size\": 3}}, {\"placement\": {\"label\": \"mds\"}, \"service_id\": \"mds\", \"service_name\": \"mds.mds\", \"service_type\": \"mds\", \"status\": {\"created\": \"2026-03-20T09:45:34.645589Z\", \"last_refresh\": \"2026-03-20T09:53:37.518101Z\", \"running\": 3, \"size\": 3}}, {\"placement\": {\"label\": \"mgr\"}, \"service_name\": \"mgr\", \"service_type\": \"mgr\", \"status\": {\"created\": \"2026-03-20T09:47:03.627105Z\", \"last_refresh\": \"2026-03-20T09:53:37.518159Z\", \"running\": 3, \"size\": 3}}, {\"events\": [\"2026-03-20T09:53:42.639367Z service:mon [INFO] \\\"service was created\\\"\"], \"placement\": {\"label\": \"mon\"}, \"service_name\": \"mon\", \"service_type\": \"mon\", \"status\": {\"created\": \"2026-03-20T09:53:39.286401Z\", \"last_refresh\": \"2026-03-20T09:53:37.518216Z\", \"running\": 3, \"size\": 3}}, {\"placement\": {\"host_pattern\": \"*\"}, \"service_name\": \"node-proxy\", \"service_type\": \"node-proxy\", \"status\": {\"created\": \"2026-03-20T07:42:38.049883Z\", \"running\": 0, \"size\": 0}}, {\"placement\": {\"hosts\": [\"np0005652759.localdomain\", \"np0005652760.localdomain\", \"np0005652761.localdomain\"]}, \"service_id\": \"default_drive_group\", \"service_name\": \"osd.default_drive_group\", \"service_type\": \"osd\", \"spec\": {\"data_devices\": {\"paths\": [\"/dev/ceph_vg0/ceph_lv0\", \"/dev/ceph_vg1/ceph_lv1\"]}, \"filter_logic\": \"AND\", \"objectstore\": \"bluestore\"}, \"status\": {\"created\": \"2026-03-20T07:43:06.026205Z\", \"last_refresh\": \"2026-03-20T09:53:37.517981Z\", \"running\": 6, \"size\": 6}}]"]} TASK [ceph_migrate : Load Service Map] ***************************************** ok: [np0005652755.localdomain] => {"ansible_facts": {"servicemap": [{"placement": {"host_pattern": "*"}, "service_name": "crash", "service_type": 
"crash", "status": {"created": "2026-03-20T07:42:24.149793Z", "last_refresh": "2026-03-20T09:53:37.517888Z", "running": 3, "size": 3}}, {"placement": {"label": "mds"}, "service_id": "mds", "service_name": "mds.mds", "service_type": "mds", "status": {"created": "2026-03-20T09:45:34.645589Z", "last_refresh": "2026-03-20T09:53:37.518101Z", "running": 3, "size": 3}}, {"placement": {"label": "mgr"}, "service_name": "mgr", "service_type": "mgr", "status": {"created": "2026-03-20T09:47:03.627105Z", "last_refresh": "2026-03-20T09:53:37.518159Z", "running": 3, "size": 3}}, {"events": ["2026-03-20T09:53:42.639367Z service:mon [INFO] \"service was created\""], "placement": {"label": "mon"}, "service_name": "mon", "service_type": "mon", "status": {"created": "2026-03-20T09:53:39.286401Z", "last_refresh": "2026-03-20T09:53:37.518216Z", "running": 3, "size": 3}}, {"placement": {"host_pattern": "*"}, "service_name": "node-proxy", "service_type": "node-proxy", "status": {"created": "2026-03-20T07:42:38.049883Z", "running": 0, "size": 0}}, {"placement": {"hosts": ["np0005652759.localdomain", "np0005652760.localdomain", "np0005652761.localdomain"]}, "service_id": "default_drive_group", "service_name": "osd.default_drive_group", "service_type": "osd", "spec": {"data_devices": {"paths": ["/dev/ceph_vg0/ceph_lv0", "/dev/ceph_vg1/ceph_lv1"]}, "filter_logic": "AND", "objectstore": "bluestore"}, "status": {"created": "2026-03-20T07:43:06.026205Z", "last_refresh": "2026-03-20T09:53:37.517981Z", "running": 6, "size": 6}}]}, "changed": false} TASK [ceph_migrate : Print Service Map] **************************************** skipping: [np0005652755.localdomain] => (item={'placement': {'host_pattern': '*'}, 'service_name': 'crash', 'service_type': 'crash', 'status': {'created': '2026-03-20T07:42:24.149793Z', 'last_refresh': '2026-03-20T09:53:37.517888Z', 'running': 3, 'size': 3}}) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": {"placement": {"host_pattern": 
"*"}, "service_name": "crash", "service_type": "crash", "status": {"created": "2026-03-20T07:42:24.149793Z", "last_refresh": "2026-03-20T09:53:37.517888Z", "running": 3, "size": 3}}} skipping: [np0005652755.localdomain] => (item={'placement': {'label': 'mds'}, 'service_id': 'mds', 'service_name': 'mds.mds', 'service_type': 'mds', 'status': {'created': '2026-03-20T09:45:34.645589Z', 'last_refresh': '2026-03-20T09:53:37.518101Z', 'running': 3, 'size': 3}}) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": {"placement": {"label": "mds"}, "service_id": "mds", "service_name": "mds.mds", "service_type": "mds", "status": {"created": "2026-03-20T09:45:34.645589Z", "last_refresh": "2026-03-20T09:53:37.518101Z", "running": 3, "size": 3}}} skipping: [np0005652755.localdomain] => (item={'placement': {'label': 'mgr'}, 'service_name': 'mgr', 'service_type': 'mgr', 'status': {'created': '2026-03-20T09:47:03.627105Z', 'last_refresh': '2026-03-20T09:53:37.518159Z', 'running': 3, 'size': 3}}) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": {"placement": {"label": "mgr"}, "service_name": "mgr", "service_type": "mgr", "status": {"created": "2026-03-20T09:47:03.627105Z", "last_refresh": "2026-03-20T09:53:37.518159Z", "running": 3, "size": 3}}} skipping: [np0005652755.localdomain] => (item={'events': ['2026-03-20T09:53:42.639367Z service:mon [INFO] "service was created"'], 'placement': {'label': 'mon'}, 'service_name': 'mon', 'service_type': 'mon', 'status': {'created': '2026-03-20T09:53:39.286401Z', 'last_refresh': '2026-03-20T09:53:37.518216Z', 'running': 3, 'size': 3}}) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": {"events": ["2026-03-20T09:53:42.639367Z service:mon [INFO] \"service was created\""], "placement": {"label": "mon"}, "service_name": "mon", "service_type": "mon", "status": {"created": "2026-03-20T09:53:39.286401Z", "last_refresh": "2026-03-20T09:53:37.518216Z", 
"running": 3, "size": 3}}} skipping: [np0005652755.localdomain] => (item={'placement': {'host_pattern': '*'}, 'service_name': 'node-proxy', 'service_type': 'node-proxy', 'status': {'created': '2026-03-20T07:42:38.049883Z', 'running': 0, 'size': 0}}) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": {"placement": {"host_pattern": "*"}, "service_name": "node-proxy", "service_type": "node-proxy", "status": {"created": "2026-03-20T07:42:38.049883Z", "running": 0, "size": 0}}} skipping: [np0005652755.localdomain] => (item={'placement': {'hosts': ['np0005652759.localdomain', 'np0005652760.localdomain', 'np0005652761.localdomain']}, 'service_id': 'default_drive_group', 'service_name': 'osd.default_drive_group', 'service_type': 'osd', 'spec': {'data_devices': {'paths': ['/dev/ceph_vg0/ceph_lv0', '/dev/ceph_vg1/ceph_lv1']}, 'filter_logic': 'AND', 'objectstore': 'bluestore'}, 'status': {'created': '2026-03-20T07:43:06.026205Z', 'last_refresh': '2026-03-20T09:53:37.517981Z', 'running': 6, 'size': 6}}) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": {"placement": {"hosts": ["np0005652759.localdomain", "np0005652760.localdomain", "np0005652761.localdomain"]}, "service_id": "default_drive_group", "service_name": "osd.default_drive_group", "service_type": "osd", "spec": {"data_devices": {"paths": ["/dev/ceph_vg0/ceph_lv0", "/dev/ceph_vg1/ceph_lv1"]}, "filter_logic": "AND", "objectstore": "bluestore"}, "status": {"created": "2026-03-20T07:43:06.026205Z", "last_refresh": "2026-03-20T09:53:37.517981Z", "running": 6, "size": 6}}} skipping: [np0005652755.localdomain] => {"msg": "All items skipped"} TASK [ceph_migrate : Dump ceph orch ls output to log file] ********************* changed: [np0005652755.localdomain] => {"changed": true, "checksum": "79cfbcd10ea2ed44c93b887420f924c4b60b47e9", "dest": "/home/tripleo-admin/ceph_client/logs/ceph_orch_ls.log", "gid": 1003, "group": "tripleo-admin", "md5sum": 
"5991b6b9940eb9c8188a09490a9187a1", "mode": "0644", "owner": "tripleo-admin", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 1600, "src": "/home/tripleo-admin/.ansible/tmp/ansible-tmp-1774000465.3280716-63242-240829499363861/source", "state": "file", "uid": 1003} TASK [ceph_migrate : Get Ceph config] ****************************************** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["cephadm", "shell", "--", "ceph", "config", "dump", "-f", "json"], "delta": "0:00:04.443828", "end": "2026-03-20 09:54:31.247001", "msg": "", "rc": 0, "start": "2026-03-20 09:54:26.803173", "stderr": "Inferring fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90\nInferring config /etc/ceph/ceph.conf\nUsing ceph image with id '957884a57883' and tag 'latest' created on 2026-02-09 10:26:08 +0000 UTC\nregistry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3", "stderr_lines": ["Inferring fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90", "Inferring config /etc/ceph/ceph.conf", "Using ceph image with id '957884a57883' and tag 'latest' created on 2026-02-09 10:26:08 +0000 UTC", "registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3"], "stdout": 
"\n[{\"section\":\"global\",\"name\":\"cluster_network\",\"value\":\"172.20.0.0/24\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"container_image\",\"value\":\"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\",\"level\":\"basic\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"ms_bind_ipv4\",\"value\":\"true\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"ms_bind_ipv6\",\"value\":\"false\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"public_network\",\"value\":\"172.18.0.0/24\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mon\",\"name\":\"auth_allow_insecure_global_id_reclaim\",\"value\":\"false\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/container_image_base\",\"value\":\"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/container_image_haproxy\",\"value\":\"registry.redhat.io/rhceph/rhceph-haproxy-rhel9:latest\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/container_image_keepalived\",\"value\":\"registry.redhat.io/rhceph/keepalived-rhel9:latest\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/container_init\",\"value\":\"True\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/migration_current\",\"value\":\"7\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/use_repo_digest\",\"value\":\"false\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section
\":\"mgr\",\"name\":\"mgr/orchestrator/orchestrator\",\"value\":\"cephadm\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"osd\",\"name\":\"osd_memory_target\",\"value\":\"5709084876\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"host:np0005652759\",\"location_type\":\"host\",\"location_value\":\"np0005652759\"},{\"section\":\"osd\",\"name\":\"osd_memory_target\",\"value\":\"5709084876\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"host:np0005652760\",\"location_type\":\"host\",\"location_value\":\"np0005652760\"},{\"section\":\"osd\",\"name\":\"osd_memory_target\",\"value\":\"5709084876\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"host:np0005652761\",\"location_type\":\"host\",\"location_value\":\"np0005652761\"},{\"section\":\"osd\",\"name\":\"osd_memory_target_autotune\",\"value\":\"true\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"mds.mds\",\"name\":\"mds_join_fs\",\"value\":\"mds\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"mds.np0005652760.wqlirt\",\"name\":\"mds_join_fs\",\"value\":\"cephfs\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"\"}]", "stdout_lines": ["", 
"[{\"section\":\"global\",\"name\":\"cluster_network\",\"value\":\"172.20.0.0/24\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"container_image\",\"value\":\"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\",\"level\":\"basic\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"ms_bind_ipv4\",\"value\":\"true\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"ms_bind_ipv6\",\"value\":\"false\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"public_network\",\"value\":\"172.18.0.0/24\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mon\",\"name\":\"auth_allow_insecure_global_id_reclaim\",\"value\":\"false\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/container_image_base\",\"value\":\"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/container_image_haproxy\",\"value\":\"registry.redhat.io/rhceph/rhceph-haproxy-rhel9:latest\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/container_image_keepalived\",\"value\":\"registry.redhat.io/rhceph/keepalived-rhel9:latest\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/container_init\",\"value\":\"True\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/migration_current\",\"value\":\"7\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/use_repo_digest\",\"value\":\"false\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\"
:\"mgr\",\"name\":\"mgr/orchestrator/orchestrator\",\"value\":\"cephadm\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"osd\",\"name\":\"osd_memory_target\",\"value\":\"5709084876\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"host:np0005652759\",\"location_type\":\"host\",\"location_value\":\"np0005652759\"},{\"section\":\"osd\",\"name\":\"osd_memory_target\",\"value\":\"5709084876\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"host:np0005652760\",\"location_type\":\"host\",\"location_value\":\"np0005652760\"},{\"section\":\"osd\",\"name\":\"osd_memory_target\",\"value\":\"5709084876\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"host:np0005652761\",\"location_type\":\"host\",\"location_value\":\"np0005652761\"},{\"section\":\"osd\",\"name\":\"osd_memory_target_autotune\",\"value\":\"true\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"mds.mds\",\"name\":\"mds_join_fs\",\"value\":\"mds\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"mds.np0005652760.wqlirt\",\"name\":\"mds_join_fs\",\"value\":\"cephfs\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"\"}]"]} TASK [ceph_migrate : Print Ceph config dump] *********************************** skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Dump ceph config dump output to log file] ***************** changed: [np0005652755.localdomain] => {"changed": true, "checksum": "f2f231bbe242e9d5982e57f0ee1def38d341d38e", "dest": "/home/tripleo-admin/ceph_client/logs/ceph_config_dump.log", "gid": 1003, "group": "tripleo-admin", "md5sum": "568ac885b82121aca954ab4bec6dde21", "mode": "0644", "owner": "tripleo-admin", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 3044, "src": "/home/tripleo-admin/.ansible/tmp/ansible-tmp-1774000471.507605-63269-169017287050921/source", "state": "file", 
"uid": 1003} TASK [ceph_migrate : Get Ceph Orch Host Map] *********************************** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["cephadm", "shell", "--", "ceph", "orch", "host", "ls", "-f", "json"], "delta": "0:00:04.739087", "end": "2026-03-20 09:54:37.792214", "msg": "", "rc": 0, "start": "2026-03-20 09:54:33.053127", "stderr": "Inferring fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90\nInferring config /etc/ceph/ceph.conf\nUsing ceph image with id '957884a57883' and tag 'latest' created on 2026-02-09 10:26:08 +0000 UTC\nregistry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3", "stderr_lines": ["Inferring fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90", "Inferring config /etc/ceph/ceph.conf", "Using ceph image with id '957884a57883' and tag 'latest' created on 2026-02-09 10:26:08 +0000 UTC", "registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3"], "stdout": "\n[{\"addr\": \"192.168.122.106\", \"hostname\": \"np0005652759.localdomain\", \"labels\": [\"osd\", \"mds\", \"mgr\", \"mon\", \"_admin\"], \"status\": \"\"}, {\"addr\": \"192.168.122.107\", \"hostname\": \"np0005652760.localdomain\", \"labels\": [\"osd\", \"mds\", \"mgr\", \"mon\", \"_admin\"], \"status\": \"\"}, {\"addr\": \"192.168.122.108\", \"hostname\": \"np0005652761.localdomain\", \"labels\": [\"osd\", \"mds\", \"mgr\", \"mon\", \"_admin\"], \"status\": \"\"}]", "stdout_lines": ["", "[{\"addr\": \"192.168.122.106\", \"hostname\": \"np0005652759.localdomain\", \"labels\": [\"osd\", \"mds\", \"mgr\", \"mon\", \"_admin\"], \"status\": \"\"}, {\"addr\": \"192.168.122.107\", \"hostname\": \"np0005652760.localdomain\", \"labels\": [\"osd\", \"mds\", \"mgr\", \"mon\", \"_admin\"], \"status\": \"\"}, {\"addr\": \"192.168.122.108\", \"hostname\": \"np0005652761.localdomain\", \"labels\": [\"osd\", \"mds\", \"mgr\", \"mon\", \"_admin\"], \"status\": \"\"}]"]} TASK 
[ceph_migrate : Load nodes] *********************************************** ok: [np0005652755.localdomain] => {"ansible_facts": {"nds": [{"addr": "192.168.122.106", "hostname": "np0005652759.localdomain", "labels": ["osd", "mds", "mgr", "mon", "_admin"], "status": ""}, {"addr": "192.168.122.107", "hostname": "np0005652760.localdomain", "labels": ["osd", "mds", "mgr", "mon", "_admin"], "status": ""}, {"addr": "192.168.122.108", "hostname": "np0005652761.localdomain", "labels": ["osd", "mds", "mgr", "mon", "_admin"], "status": ""}]}, "changed": false} TASK [ceph_migrate : Load hostmap List] **************************************** ok: [np0005652755.localdomain] => {"ansible_facts": {"hostmap": {"np0005652759.localdomain": ["osd", "mds", "mgr", "mon", "_admin"], "np0005652760.localdomain": ["osd", "mds", "mgr", "mon", "_admin"], "np0005652761.localdomain": ["osd", "mds", "mgr", "mon", "_admin"]}}, "changed": false} TASK [ceph_migrate : Print Host Map] ******************************************* skipping: [np0005652755.localdomain] => (item=np0005652759.localdomain) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": "np0005652759.localdomain"} skipping: [np0005652755.localdomain] => (item=np0005652760.localdomain) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": "np0005652760.localdomain"} skipping: [np0005652755.localdomain] => (item=np0005652761.localdomain) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": "np0005652761.localdomain"} skipping: [np0005652755.localdomain] => {"msg": "All items skipped"} TASK [ceph_migrate : Dump ceph orch host ls output to log file] **************** changed: [np0005652755.localdomain] => {"changed": true, "checksum": "1a2347748dfa127ea96264e4df8a020ea5f0e4f3", "dest": "/home/tripleo-admin/ceph_client/logs/ceph_orch_host_ls.log", "gid": 1003, "group": "tripleo-admin", "md5sum": "ea040e5f042420f04ec0e102d6ebd816", "mode": "0644", 
"owner": "tripleo-admin", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 84, "src": "/home/tripleo-admin/.ansible/tmp/ansible-tmp-1774000478.1124103-63302-48879568679688/source", "state": "file", "uid": 1003} TASK [ceph_migrate : Get Ceph monmap and load data] **************************** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["cephadm", "shell", "--", "ceph", "mon", "dump", "-f", "json"], "delta": "0:00:04.676024", "end": "2026-03-20 09:54:44.348430", "msg": "", "rc": 0, "start": "2026-03-20 09:54:39.672406", "stderr": "Inferring fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90\nInferring config /etc/ceph/ceph.conf\nUsing ceph image with id '957884a57883' and tag 'latest' created on 2026-02-09 10:26:08 +0000 UTC\nregistry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3\ndumped monmap epoch 17", "stderr_lines": ["Inferring fsid 39ff5591-b969-58ac-89fa-bf85e4fa1d90", "Inferring config /etc/ceph/ceph.conf", "Using ceph image with id '957884a57883' and tag 'latest' created on 2026-02-09 10:26:08 +0000 UTC", "registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:50945286dae5941044aa91c7700ff058ab2cb308d5d1d6d6bb2daf28aa7a0ca3", "dumped monmap epoch 17"], "stdout": "\n{\"epoch\":17,\"fsid\":\"39ff5591-b969-58ac-89fa-bf85e4fa1d90\",\"modified\":\"2026-03-20T09:52:43.415072Z\",\"created\":\"2026-03-20T07:41:47.261900Z\",\"min_mon_release\":18,\"min_mon_release_name\":\"reef\",\"election_strategy\":1,\"disallowed_leaders: \":\"\",\"stretch_mode\":false,\"tiebreaker_mon\":\"\",\"removed_ranks: 
\":\"\",\"features\":{\"persistent\":[\"kraken\",\"luminous\",\"mimic\",\"osdmap-prune\",\"nautilus\",\"octopus\",\"pacific\",\"elector-pinging\",\"quincy\",\"reef\"],\"optional\":[]},\"mons\":[{\"rank\":0,\"name\":\"np0005652759\",\"public_addrs\":{\"addrvec\":[{\"type\":\"v2\",\"addr\":\"172.18.0.103:3300\",\"nonce\":0},{\"type\":\"v1\",\"addr\":\"172.18.0.103:6789\",\"nonce\":0}]},\"addr\":\"172.18.0.103:6789/0\",\"public_addr\":\"172.18.0.103:6789/0\",\"priority\":0,\"weight\":0,\"crush_location\":\"{}\"},{\"rank\":1,\"name\":\"np0005652760\",\"public_addrs\":{\"addrvec\":[{\"type\":\"v2\",\"addr\":\"172.18.0.104:3300\",\"nonce\":0},{\"type\":\"v1\",\"addr\":\"172.18.0.104:6789\",\"nonce\":0}]},\"addr\":\"172.18.0.104:6789/0\",\"public_addr\":\"172.18.0.104:6789/0\",\"priority\":0,\"weight\":0,\"crush_location\":\"{}\"},{\"rank\":2,\"name\":\"np0005652761\",\"public_addrs\":{\"addrvec\":[{\"type\":\"v2\",\"addr\":\"172.18.0.105:3300\",\"nonce\":0},{\"type\":\"v1\",\"addr\":\"172.18.0.105:6789\",\"nonce\":0}]},\"addr\":\"172.18.0.105:6789/0\",\"public_addr\":\"172.18.0.105:6789/0\",\"priority\":0,\"weight\":0,\"crush_location\":\"{}\"}],\"quorum\":[0,1,2]}", "stdout_lines": ["", "{\"epoch\":17,\"fsid\":\"39ff5591-b969-58ac-89fa-bf85e4fa1d90\",\"modified\":\"2026-03-20T09:52:43.415072Z\",\"created\":\"2026-03-20T07:41:47.261900Z\",\"min_mon_release\":18,\"min_mon_release_name\":\"reef\",\"election_strategy\":1,\"disallowed_leaders: \":\"\",\"stretch_mode\":false,\"tiebreaker_mon\":\"\",\"removed_ranks: 
\":\"\",\"features\":{\"persistent\":[\"kraken\",\"luminous\",\"mimic\",\"osdmap-prune\",\"nautilus\",\"octopus\",\"pacific\",\"elector-pinging\",\"quincy\",\"reef\"],\"optional\":[]},\"mons\":[{\"rank\":0,\"name\":\"np0005652759\",\"public_addrs\":{\"addrvec\":[{\"type\":\"v2\",\"addr\":\"172.18.0.103:3300\",\"nonce\":0},{\"type\":\"v1\",\"addr\":\"172.18.0.103:6789\",\"nonce\":0}]},\"addr\":\"172.18.0.103:6789/0\",\"public_addr\":\"172.18.0.103:6789/0\",\"priority\":0,\"weight\":0,\"crush_location\":\"{}\"},{\"rank\":1,\"name\":\"np0005652760\",\"public_addrs\":{\"addrvec\":[{\"type\":\"v2\",\"addr\":\"172.18.0.104:3300\",\"nonce\":0},{\"type\":\"v1\",\"addr\":\"172.18.0.104:6789\",\"nonce\":0}]},\"addr\":\"172.18.0.104:6789/0\",\"public_addr\":\"172.18.0.104:6789/0\",\"priority\":0,\"weight\":0,\"crush_location\":\"{}\"},{\"rank\":2,\"name\":\"np0005652761\",\"public_addrs\":{\"addrvec\":[{\"type\":\"v2\",\"addr\":\"172.18.0.105:3300\",\"nonce\":0},{\"type\":\"v1\",\"addr\":\"172.18.0.105:6789\",\"nonce\":0}]},\"addr\":\"172.18.0.105:6789/0\",\"public_addr\":\"172.18.0.105:6789/0\",\"priority\":0,\"weight\":0,\"crush_location\":\"{}\"}],\"quorum\":[0,1,2]}"]} TASK [ceph_migrate : Get Monmap] *********************************************** ok: [np0005652755.localdomain] => {"ansible_facts": {"mon_dump": {"created": "2026-03-20T07:41:47.261900Z", "disallowed_leaders: ": "", "election_strategy": 1, "epoch": 17, "features": {"optional": [], "persistent": ["kraken", "luminous", "mimic", "osdmap-prune", "nautilus", "octopus", "pacific", "elector-pinging", "quincy", "reef"]}, "fsid": "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "min_mon_release": 18, "min_mon_release_name": "reef", "modified": "2026-03-20T09:52:43.415072Z", "mons": [{"addr": "172.18.0.103:6789/0", "crush_location": "{}", "name": "np0005652759", "priority": 0, "public_addr": "172.18.0.103:6789/0", "public_addrs": {"addrvec": [{"addr": "172.18.0.103:3300", "nonce": 0, "type": "v2"}, {"addr": 
"172.18.0.103:6789", "nonce": 0, "type": "v1"}]}, "rank": 0, "weight": 0}, {"addr": "172.18.0.104:6789/0", "crush_location": "{}", "name": "np0005652760", "priority": 0, "public_addr": "172.18.0.104:6789/0", "public_addrs": {"addrvec": [{"addr": "172.18.0.104:3300", "nonce": 0, "type": "v2"}, {"addr": "172.18.0.104:6789", "nonce": 0, "type": "v1"}]}, "rank": 1, "weight": 0}, {"addr": "172.18.0.105:6789/0", "crush_location": "{}", "name": "np0005652761", "priority": 0, "public_addr": "172.18.0.105:6789/0", "public_addrs": {"addrvec": [{"addr": "172.18.0.105:3300", "nonce": 0, "type": "v2"}, {"addr": "172.18.0.105:6789", "nonce": 0, "type": "v1"}]}, "rank": 2, "weight": 0}], "quorum": [0, 1, 2], "removed_ranks: ": "", "stretch_mode": false, "tiebreaker_mon": ""}}, "changed": false} TASK [ceph_migrate : Print monmap] ********************************************* skipping: [np0005652755.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Dump ceph mon dump output to log file] ******************** changed: [np0005652755.localdomain] => {"changed": true, "checksum": "c9c72d7e75bd0d7edab96f872dc4b18d99c3e3c2", "dest": "/home/tripleo-admin/ceph_client/logs/ceph_mon_dump.log", "gid": 1003, "group": "tripleo-admin", "md5sum": "7df9e6873f0ceab46b9075e4a9640aa2", "mode": "0644", "owner": "tripleo-admin", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 1425, "src": "/home/tripleo-admin/.ansible/tmp/ansible-tmp-1774000484.6324577-63335-123532551466244/source", "state": "file", "uid": 1003} TASK [ceph_migrate : Load nodes to decommission] ******************************* ok: [np0005652755.localdomain] => {"ansible_facts": {"decomm_nodes": ["np0005652759.localdomain", "np0005652760.localdomain", "np0005652761.localdomain"]}, "changed": false} TASK [ceph_migrate : Load target nodes] **************************************** ok: [np0005652755.localdomain] => {"ansible_facts": {"target_nodes": ["np0005652759.localdomain", 
"np0005652760.localdomain", "np0005652761.localdomain"]}, "changed": false} TASK [ceph_migrate : Print target nodes] *************************************** skipping: [np0005652755.localdomain] => {"false_condition": "debug|default(false)"} TASK [ceph_migrate : Print decomm_nodes] *************************************** skipping: [np0005652755.localdomain] => {"false_condition": "debug|default(false)"} TASK [ceph_migrate : Configure Swift to use rgw backend] *********************** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.rgw | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Flush handlers to ensure mgr restart completes] *********** RUNNING HANDLER [ceph_migrate : restart mgr] *********************************** changed: [np0005652755.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "39ff5591-b969-58ac-89fa-bf85e4fa1d90", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "mgr", "fail"], "delta": "0:00:00.722900", "end": "2026-03-20 09:54:46.950738", "msg": "", "rc": 0, "start": "2026-03-20 09:54:46.227838", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : Install cephadm on all compute nodes] ********************* skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "groups['ComputeHCI'] is defined", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Force fail ceph mgr on first compute node] **************** skipping: [np0005652755.localdomain] => {"changed": false, "false_condition": "groups['ComputeHCI'] is defined", "skip_reason": "Conditional result was False"} PLAY RECAP ********************************************************************* np0005652755.localdomain : 
ok=239 changed=111 unreachable=0 failed=0 skipped=143 rescued=0 ignored=0