[WARNING]: While constructing a mapping from /home/zuul/src/review.rdoproject.org/rdo-jobs/playbooks/data_plane_adoption/vars.yaml, line 6, column 1, found a duplicate dict key (edpm_node_hostname). Using last defined value only.
[WARNING]: While constructing a mapping from /home/zuul/src/review.rdoproject.org/rdo-jobs/playbooks/data_plane_adoption/vars.yaml, line 6, column 1, found a duplicate dict key (edpm_node_ip). Using last defined value only.
[WARNING]: While constructing a mapping from /home/zuul/src/review.rdoproject.org/rdo-jobs/playbooks/data_plane_adoption/vars.yaml, line 6, column 1, found a duplicate dict key (source_galera_members). Using last defined value only.
[WARNING]: While constructing a mapping from /home/zuul/src/review.rdoproject.org/rdo-jobs/playbooks/data_plane_adoption/vars.yaml, line 6, column 1, found a duplicate dict key (source_mariadb_ip). Using last defined value only.
[WARNING]: While constructing a mapping from /home/zuul/src/review.rdoproject.org/rdo-jobs/playbooks/data_plane_adoption/vars.yaml, line 6, column 1, found a duplicate dict key (enable_tlse). Using last defined value only.
[WARNING]: While constructing a mapping from /home/zuul/src/review.rdoproject.org/rdo-jobs/playbooks/data_plane_adoption/vars.yaml, line 6, column 1, found a duplicate dict key (neutron_qe_test). Using last defined value only.
[WARNING]: While constructing a mapping from /home/zuul/src/review.rdoproject.org/rdo-jobs/playbooks/data_plane_adoption/vars.yaml, line 6, column 1, found a duplicate dict key (tobiko_qe_test). Using last defined value only.
[WARNING]: While constructing a mapping from /home/zuul/src/review.rdoproject.org/rdo-jobs/playbooks/data_plane_adoption/vars.yaml, line 6, column 1, found a duplicate dict key (neutron_qe_dir). Using last defined value only.
[WARNING]: While constructing a mapping from /home/zuul/src/review.rdoproject.org/rdo-jobs/playbooks/data_plane_adoption/vars.yaml, line 6, column 1, found a duplicate dict key (prelaunch_barbican_secret). Using last defined value only.
[WARNING]: While constructing a mapping from /home/zuul/src/review.rdoproject.org/rdo-jobs/playbooks/data_plane_adoption/vars.yaml, line 6, column 1, found a duplicate dict key (os_cloud_name). Using last defined value only.
[WARNING]: While constructing a mapping from /home/zuul/src/review.rdoproject.org/rdo-jobs/playbooks/data_plane_adoption/vars.yaml, line 6, column 1, found a duplicate dict key (standalone_ip). Using last defined value only.
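Ansible's YAML loader keeps only the last occurrence of a duplicate mapping key, so each key listed above silently takes its final value. A minimal sketch of the pattern in vars.yaml that produces these warnings, using two of the reported key names with purely illustrative values:

    edpm_node_hostname: compute-0.example.com    # earlier definition, silently discarded
    edpm_node_ip: 192.0.2.10
    # ... further down in the same top-level mapping ...
    edpm_node_hostname: compute-1.example.com    # last definition wins
    edpm_node_ip: 192.0.2.11

Removing the duplicates (or merging the intended values into a single definition per key) clears the warnings and makes the effective variables explicit.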
Using /home/zuul/src/review.rdoproject.org/rdo-jobs/playbooks/data_plane_adoption/ansible.cfg as config file PLAY [Externalize Ceph] ******************************************************** TASK [Gathering Facts] ********************************************************* ok: [np0005486728.localdomain] TASK [ceph_migrate : Check file in the src directory] ************************** [WARNING]: Skipped '/home/tripleo-admin/ceph_client' path due to this access issue: '/home/tripleo-admin/ceph_client' is not a directory ok: [np0005486728.localdomain] => {"changed": false, "examined": 0, "files": [], "matched": 0, "msg": "Not all paths examined, check warnings for details", "skipped_paths": {"/home/tripleo-admin/ceph_client": "'/home/tripleo-admin/ceph_client' is not a directory"}} TASK [ceph_migrate : Restore files] ******************************************** skipping: [np0005486728.localdomain] => (item=ceph.conf) => {"ansible_loop_var": "item", "changed": false, "false_condition": "dir_ceph_files.files | length > 0", "item": "ceph.conf", "skip_reason": "Conditional result was False"} skipping: [np0005486728.localdomain] => (item=ceph.client.admin.keyring) => {"ansible_loop_var": "item", "changed": false, "false_condition": "dir_ceph_files.files | length > 0", "item": "ceph.client.admin.keyring", "skip_reason": "Conditional result was False"} skipping: [np0005486728.localdomain] => {"changed": false, "msg": "All items skipped"} TASK [ceph_migrate : Ensure backup directory exists] *************************** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "dump | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Get Ceph Health] ****************************************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["cephadm", "shell", "--", "ceph", "-s", "-f", "json"], "delta": "0:00:02.609805", "end": "2025-10-14 09:59:23.262479", "msg": "", "rc": 0, "start": "2025-10-14 09:59:20.652674", "stderr": "Inferring fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf\nInferring config /var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/mon.np0005486728/config\nUsing ceph image with id '2cc8f762c607' and tag 'latest' created on 2025-09-24 09:04:33 +0000 UTC\nregistry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8", "stderr_lines": ["Inferring fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "Inferring config /var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/mon.np0005486728/config", "Using ceph image with id '2cc8f762c607' and tag 'latest' created on 2025-09-24 09:04:33 +0000 UTC", "registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8"], "stdout": 
"\n{\"fsid\":\"fcadf6e2-9176-5818-a8d0-37b19acf8eaf\",\"health\":{\"status\":\"HEALTH_OK\",\"checks\":{},\"mutes\":[]},\"election_epoch\":14,\"quorum\":[0,1,2],\"quorum_names\":[\"np0005486728\",\"np0005486730\",\"np0005486729\"],\"quorum_age\":7632,\"monmap\":{\"epoch\":3,\"min_mon_release_name\":\"reef\",\"num_mons\":3},\"osdmap\":{\"epoch\":76,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1760428371,\"num_in_osds\":6,\"osd_in_since\":1760428350,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109571242,\"bytes_used\":573931520,\"bytes_avail\":44498059264,\"bytes_total\":45071990784},\"fsmap\":{\"epoch\":6,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005486730.hzolgi\",\"status\":\"up:active\",\"gid\":24229}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":2,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":68,\"modified\":\"2025-10-14T09:57:57.715578+0000\",\"services\":{}},\"progress_events\":{}}", "stdout_lines": ["", "{\"fsid\":\"fcadf6e2-9176-5818-a8d0-37b19acf8eaf\",\"health\":{\"status\":\"HEALTH_OK\",\"checks\":{},\"mutes\":[]},\"election_epoch\":14,\"quorum\":[0,1,2],\"quorum_names\":[\"np0005486728\",\"np0005486730\",\"np0005486729\"],\"quorum_age\":7632,\"monmap\":{\"epoch\":3,\"min_mon_release_name\":\"reef\",\"num_mons\":3},\"osdmap\":{\"epoch\":76,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1760428371,\"num_in_osds\":6,\"osd_in_since\":1760428350,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109571242,\"bytes_used\":573931520,\"bytes_avail\":44498059264,\"bytes_total\":45071990784},\"fsmap\":{\"epoch\":6,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005486730.hzolgi\",\"status\":\"up:active\",\"gid\":24229}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":2,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":68,\"modified\":\"2025-10-14T09:57:57.715578+0000\",\"services\":{}},\"progress_events\":{}}"]} TASK [ceph_migrate : Load ceph data] ******************************************* ok: [np0005486728.localdomain] => {"ansible_facts": {"ceph": {"election_epoch": 14, "fsid": "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "fsmap": {"by_rank": [{"filesystem_id": 1, "gid": 24229, "name": "mds.np0005486730.hzolgi", "rank": 0, "status": "up:active"}], "epoch": 6, "id": 1, "in": 1, "max": 1, "up": 1, "up:standby": 2}, "health": {"checks": {}, "mutes": [], "status": "HEALTH_OK"}, "mgrmap": {"available": true, "modules": ["cephadm", "iostat", "nfs", "restful"], "num_standbys": 2, "services": {}}, "monmap": {"epoch": 3, "min_mon_release_name": "reef", "num_mons": 3}, "osdmap": {"epoch": 76, "num_in_osds": 6, "num_osds": 6, "num_remapped_pgs": 0, "num_up_osds": 6, "osd_in_since": 1760428350, "osd_up_since": 1760428371}, "pgmap": {"bytes_avail": 44498059264, "bytes_total": 45071990784, "bytes_used": 573931520, "data_bytes": 109571242, "num_objects": 69, "num_pgs": 177, "num_pools": 7, "pgs_by_state": [{"count": 177, "state_name": "active+clean"}]}, "progress_events": {}, "quorum": [0, 1, 2], "quorum_age": 7632, "quorum_names": ["np0005486728", "np0005486730", "np0005486729"], "servicemap": {"epoch": 68, 
"modified": "2025-10-14T09:57:57.715578+0000", "services": {}}}}, "changed": false} TASK [ceph_migrate : Dump ceph -s output to log file] ************************** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "dump | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Get Ceph Orch ServiceMap] ********************************* changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["cephadm", "shell", "--", "ceph", "orch", "ls", "-f", "json"], "delta": "0:00:02.617142", "end": "2025-10-14 09:59:26.578171", "msg": "", "rc": 0, "start": "2025-10-14 09:59:23.961029", "stderr": "Inferring fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf\nInferring config /var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/mon.np0005486728/config\nUsing ceph image with id '2cc8f762c607' and tag 'latest' created on 2025-09-24 09:04:33 +0000 UTC\nregistry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8", "stderr_lines": ["Inferring fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "Inferring config /var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/mon.np0005486728/config", "Using ceph image with id '2cc8f762c607' and tag 'latest' created on 2025-09-24 09:04:33 +0000 UTC", "registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8"], "stdout": "\n[{\"events\": [\"2025-10-14T07:52:24.674730Z service:crash [INFO] \\\"service was created\\\"\"], \"placement\": {\"host_pattern\": \"*\"}, \"service_name\": \"crash\", \"service_type\": \"crash\", \"status\": {\"created\": \"2025-10-14T07:50:27.394836Z\", \"last_refresh\": \"2025-10-14T09:49:25.491005Z\", \"running\": 6, \"size\": 6}}, {\"events\": [\"2025-10-14T08:11:55.583174Z service:mds.mds [INFO] \\\"service was created\\\"\"], \"placement\": {\"hosts\": [\"np0005486728.localdomain\", \"np0005486729.localdomain\", \"np0005486730.localdomain\"]}, \"service_id\": \"mds\", \"service_name\": \"mds.mds\", \"service_type\": \"mds\", \"status\": {\"created\": \"2025-10-14T08:11:49.076867Z\", \"last_refresh\": \"2025-10-14T09:55:45.631759Z\", \"running\": 3, \"size\": 3}}, {\"events\": [\"2025-10-14T07:52:12.631559Z service:mgr [INFO] \\\"service was created\\\"\", \"2025-10-14T07:51:18.473153Z service:mgr [ERROR] \\\"Failed to apply: Cannot place on np0005486730.localdomain: Unknown hosts\\\"\"], \"placement\": {\"hosts\": [\"np0005486728.localdomain\", \"np0005486729.localdomain\", \"np0005486730.localdomain\"]}, \"service_name\": \"mgr\", \"service_type\": \"mgr\", \"status\": {\"created\": \"2025-10-14T07:51:08.286779Z\", \"last_refresh\": \"2025-10-14T09:55:45.631387Z\", \"running\": 3, \"size\": 3}}, {\"events\": [\"2025-10-14T07:52:04.203143Z service:mon [INFO] \\\"service was created\\\"\", \"2025-10-14T07:51:18.471808Z service:mon [ERROR] \\\"Failed to apply: Cannot place on np0005486730.localdomain: Unknown hosts\\\"\"], \"placement\": {\"hosts\": [\"np0005486728.localdomain\", \"np0005486729.localdomain\", \"np0005486730.localdomain\"]}, \"service_name\": \"mon\", \"service_type\": \"mon\", \"status\": {\"created\": \"2025-10-14T07:51:08.277990Z\", \"last_refresh\": \"2025-10-14T09:55:45.631199Z\", \"running\": 3, \"size\": 3}}, {\"events\": [\"2025-10-14T07:50:41.259157Z service:node-proxy [INFO] \\\"service was created\\\"\"], \"placement\": {\"host_pattern\": \"*\"}, \"service_name\": \"node-proxy\", \"service_type\": \"node-proxy\", \"status\": {\"created\": 
\"2025-10-14T07:50:41.216505Z\", \"running\": 0, \"size\": 0}}, {\"events\": [\"2025-10-14T07:51:08.326121Z service:osd.default_drive_group [INFO] \\\"service was created\\\"\"], \"placement\": {\"hosts\": [\"np0005486731.localdomain\", \"np0005486732.localdomain\", \"np0005486733.localdomain\"]}, \"service_id\": \"default_drive_group\", \"service_name\": \"osd.default_drive_group\", \"service_type\": \"osd\", \"spec\": {\"data_devices\": {\"paths\": [\"/dev/ceph_vg0/ceph_lv0\", \"/dev/ceph_vg1/ceph_lv1\"]}, \"filter_logic\": \"AND\", \"objectstore\": \"bluestore\"}, \"status\": {\"created\": \"2025-10-14T07:51:08.318859Z\", \"last_refresh\": \"2025-10-14T09:49:25.491233Z\", \"running\": 6, \"size\": 6}}]", "stdout_lines": ["", "[{\"events\": [\"2025-10-14T07:52:24.674730Z service:crash [INFO] \\\"service was created\\\"\"], \"placement\": {\"host_pattern\": \"*\"}, \"service_name\": \"crash\", \"service_type\": \"crash\", \"status\": {\"created\": \"2025-10-14T07:50:27.394836Z\", \"last_refresh\": \"2025-10-14T09:49:25.491005Z\", \"running\": 6, \"size\": 6}}, {\"events\": [\"2025-10-14T08:11:55.583174Z service:mds.mds [INFO] \\\"service was created\\\"\"], \"placement\": {\"hosts\": [\"np0005486728.localdomain\", \"np0005486729.localdomain\", \"np0005486730.localdomain\"]}, \"service_id\": \"mds\", \"service_name\": \"mds.mds\", \"service_type\": \"mds\", \"status\": {\"created\": \"2025-10-14T08:11:49.076867Z\", \"last_refresh\": \"2025-10-14T09:55:45.631759Z\", \"running\": 3, \"size\": 3}}, {\"events\": [\"2025-10-14T07:52:12.631559Z service:mgr [INFO] \\\"service was created\\\"\", \"2025-10-14T07:51:18.473153Z service:mgr [ERROR] \\\"Failed to apply: Cannot place on np0005486730.localdomain: Unknown hosts\\\"\"], \"placement\": {\"hosts\": [\"np0005486728.localdomain\", \"np0005486729.localdomain\", \"np0005486730.localdomain\"]}, \"service_name\": \"mgr\", \"service_type\": \"mgr\", \"status\": {\"created\": \"2025-10-14T07:51:08.286779Z\", \"last_refresh\": \"2025-10-14T09:55:45.631387Z\", \"running\": 3, \"size\": 3}}, {\"events\": [\"2025-10-14T07:52:04.203143Z service:mon [INFO] \\\"service was created\\\"\", \"2025-10-14T07:51:18.471808Z service:mon [ERROR] \\\"Failed to apply: Cannot place on np0005486730.localdomain: Unknown hosts\\\"\"], \"placement\": {\"hosts\": [\"np0005486728.localdomain\", \"np0005486729.localdomain\", \"np0005486730.localdomain\"]}, \"service_name\": \"mon\", \"service_type\": \"mon\", \"status\": {\"created\": \"2025-10-14T07:51:08.277990Z\", \"last_refresh\": \"2025-10-14T09:55:45.631199Z\", \"running\": 3, \"size\": 3}}, {\"events\": [\"2025-10-14T07:50:41.259157Z service:node-proxy [INFO] \\\"service was created\\\"\"], \"placement\": {\"host_pattern\": \"*\"}, \"service_name\": \"node-proxy\", \"service_type\": \"node-proxy\", \"status\": {\"created\": \"2025-10-14T07:50:41.216505Z\", \"running\": 0, \"size\": 0}}, {\"events\": [\"2025-10-14T07:51:08.326121Z service:osd.default_drive_group [INFO] \\\"service was created\\\"\"], \"placement\": {\"hosts\": [\"np0005486731.localdomain\", \"np0005486732.localdomain\", \"np0005486733.localdomain\"]}, \"service_id\": \"default_drive_group\", \"service_name\": \"osd.default_drive_group\", \"service_type\": \"osd\", \"spec\": {\"data_devices\": {\"paths\": [\"/dev/ceph_vg0/ceph_lv0\", \"/dev/ceph_vg1/ceph_lv1\"]}, \"filter_logic\": \"AND\", \"objectstore\": \"bluestore\"}, \"status\": {\"created\": \"2025-10-14T07:51:08.318859Z\", \"last_refresh\": \"2025-10-14T09:49:25.491233Z\", \"running\": 6, 
\"size\": 6}}]"]} TASK [ceph_migrate : Load Service Map] ***************************************** ok: [np0005486728.localdomain] => {"ansible_facts": {"servicemap": [{"events": ["2025-10-14T07:52:24.674730Z service:crash [INFO] \"service was created\""], "placement": {"host_pattern": "*"}, "service_name": "crash", "service_type": "crash", "status": {"created": "2025-10-14T07:50:27.394836Z", "last_refresh": "2025-10-14T09:49:25.491005Z", "running": 6, "size": 6}}, {"events": ["2025-10-14T08:11:55.583174Z service:mds.mds [INFO] \"service was created\""], "placement": {"hosts": ["np0005486728.localdomain", "np0005486729.localdomain", "np0005486730.localdomain"]}, "service_id": "mds", "service_name": "mds.mds", "service_type": "mds", "status": {"created": "2025-10-14T08:11:49.076867Z", "last_refresh": "2025-10-14T09:55:45.631759Z", "running": 3, "size": 3}}, {"events": ["2025-10-14T07:52:12.631559Z service:mgr [INFO] \"service was created\"", "2025-10-14T07:51:18.473153Z service:mgr [ERROR] \"Failed to apply: Cannot place on np0005486730.localdomain: Unknown hosts\""], "placement": {"hosts": ["np0005486728.localdomain", "np0005486729.localdomain", "np0005486730.localdomain"]}, "service_name": "mgr", "service_type": "mgr", "status": {"created": "2025-10-14T07:51:08.286779Z", "last_refresh": "2025-10-14T09:55:45.631387Z", "running": 3, "size": 3}}, {"events": ["2025-10-14T07:52:04.203143Z service:mon [INFO] \"service was created\"", "2025-10-14T07:51:18.471808Z service:mon [ERROR] \"Failed to apply: Cannot place on np0005486730.localdomain: Unknown hosts\""], "placement": {"hosts": ["np0005486728.localdomain", "np0005486729.localdomain", "np0005486730.localdomain"]}, "service_name": "mon", "service_type": "mon", "status": {"created": "2025-10-14T07:51:08.277990Z", "last_refresh": "2025-10-14T09:55:45.631199Z", "running": 3, "size": 3}}, {"events": ["2025-10-14T07:50:41.259157Z service:node-proxy [INFO] \"service was created\""], "placement": {"host_pattern": "*"}, "service_name": "node-proxy", "service_type": "node-proxy", "status": {"created": "2025-10-14T07:50:41.216505Z", "running": 0, "size": 0}}, {"events": ["2025-10-14T07:51:08.326121Z service:osd.default_drive_group [INFO] \"service was created\""], "placement": {"hosts": ["np0005486731.localdomain", "np0005486732.localdomain", "np0005486733.localdomain"]}, "service_id": "default_drive_group", "service_name": "osd.default_drive_group", "service_type": "osd", "spec": {"data_devices": {"paths": ["/dev/ceph_vg0/ceph_lv0", "/dev/ceph_vg1/ceph_lv1"]}, "filter_logic": "AND", "objectstore": "bluestore"}, "status": {"created": "2025-10-14T07:51:08.318859Z", "last_refresh": "2025-10-14T09:49:25.491233Z", "running": 6, "size": 6}}]}, "changed": false} TASK [ceph_migrate : Print Service Map] **************************************** skipping: [np0005486728.localdomain] => (item={'events': ['2025-10-14T07:52:24.674730Z service:crash [INFO] "service was created"'], 'placement': {'host_pattern': '*'}, 'service_name': 'crash', 'service_type': 'crash', 'status': {'created': '2025-10-14T07:50:27.394836Z', 'last_refresh': '2025-10-14T09:49:25.491005Z', 'running': 6, 'size': 6}}) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": {"events": ["2025-10-14T07:52:24.674730Z service:crash [INFO] \"service was created\""], "placement": {"host_pattern": "*"}, "service_name": "crash", "service_type": "crash", "status": {"created": "2025-10-14T07:50:27.394836Z", "last_refresh": "2025-10-14T09:49:25.491005Z", "running": 6, "size": 
6}}} skipping: [np0005486728.localdomain] => (item={'events': ['2025-10-14T08:11:55.583174Z service:mds.mds [INFO] "service was created"'], 'placement': {'hosts': ['np0005486728.localdomain', 'np0005486729.localdomain', 'np0005486730.localdomain']}, 'service_id': 'mds', 'service_name': 'mds.mds', 'service_type': 'mds', 'status': {'created': '2025-10-14T08:11:49.076867Z', 'last_refresh': '2025-10-14T09:55:45.631759Z', 'running': 3, 'size': 3}}) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": {"events": ["2025-10-14T08:11:55.583174Z service:mds.mds [INFO] \"service was created\""], "placement": {"hosts": ["np0005486728.localdomain", "np0005486729.localdomain", "np0005486730.localdomain"]}, "service_id": "mds", "service_name": "mds.mds", "service_type": "mds", "status": {"created": "2025-10-14T08:11:49.076867Z", "last_refresh": "2025-10-14T09:55:45.631759Z", "running": 3, "size": 3}}} skipping: [np0005486728.localdomain] => (item={'events': ['2025-10-14T07:52:12.631559Z service:mgr [INFO] "service was created"', '2025-10-14T07:51:18.473153Z service:mgr [ERROR] "Failed to apply: Cannot place on np0005486730.localdomain: Unknown hosts"'], 'placement': {'hosts': ['np0005486728.localdomain', 'np0005486729.localdomain', 'np0005486730.localdomain']}, 'service_name': 'mgr', 'service_type': 'mgr', 'status': {'created': '2025-10-14T07:51:08.286779Z', 'last_refresh': '2025-10-14T09:55:45.631387Z', 'running': 3, 'size': 3}}) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": {"events": ["2025-10-14T07:52:12.631559Z service:mgr [INFO] \"service was created\"", "2025-10-14T07:51:18.473153Z service:mgr [ERROR] \"Failed to apply: Cannot place on np0005486730.localdomain: Unknown hosts\""], "placement": {"hosts": ["np0005486728.localdomain", "np0005486729.localdomain", "np0005486730.localdomain"]}, "service_name": "mgr", "service_type": "mgr", "status": {"created": "2025-10-14T07:51:08.286779Z", "last_refresh": "2025-10-14T09:55:45.631387Z", "running": 3, "size": 3}}} skipping: [np0005486728.localdomain] => (item={'events': ['2025-10-14T07:52:04.203143Z service:mon [INFO] "service was created"', '2025-10-14T07:51:18.471808Z service:mon [ERROR] "Failed to apply: Cannot place on np0005486730.localdomain: Unknown hosts"'], 'placement': {'hosts': ['np0005486728.localdomain', 'np0005486729.localdomain', 'np0005486730.localdomain']}, 'service_name': 'mon', 'service_type': 'mon', 'status': {'created': '2025-10-14T07:51:08.277990Z', 'last_refresh': '2025-10-14T09:55:45.631199Z', 'running': 3, 'size': 3}}) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": {"events": ["2025-10-14T07:52:04.203143Z service:mon [INFO] \"service was created\"", "2025-10-14T07:51:18.471808Z service:mon [ERROR] \"Failed to apply: Cannot place on np0005486730.localdomain: Unknown hosts\""], "placement": {"hosts": ["np0005486728.localdomain", "np0005486729.localdomain", "np0005486730.localdomain"]}, "service_name": "mon", "service_type": "mon", "status": {"created": "2025-10-14T07:51:08.277990Z", "last_refresh": "2025-10-14T09:55:45.631199Z", "running": 3, "size": 3}}} skipping: [np0005486728.localdomain] => (item={'events': ['2025-10-14T07:50:41.259157Z service:node-proxy [INFO] "service was created"'], 'placement': {'host_pattern': '*'}, 'service_name': 'node-proxy', 'service_type': 'node-proxy', 'status': {'created': '2025-10-14T07:50:41.216505Z', 'running': 0, 'size': 0}}) => {"ansible_loop_var": "item", "false_condition": "debug | 
default(false)", "item": {"events": ["2025-10-14T07:50:41.259157Z service:node-proxy [INFO] \"service was created\""], "placement": {"host_pattern": "*"}, "service_name": "node-proxy", "service_type": "node-proxy", "status": {"created": "2025-10-14T07:50:41.216505Z", "running": 0, "size": 0}}} skipping: [np0005486728.localdomain] => (item={'events': ['2025-10-14T07:51:08.326121Z service:osd.default_drive_group [INFO] "service was created"'], 'placement': {'hosts': ['np0005486731.localdomain', 'np0005486732.localdomain', 'np0005486733.localdomain']}, 'service_id': 'default_drive_group', 'service_name': 'osd.default_drive_group', 'service_type': 'osd', 'spec': {'data_devices': {'paths': ['/dev/ceph_vg0/ceph_lv0', '/dev/ceph_vg1/ceph_lv1']}, 'filter_logic': 'AND', 'objectstore': 'bluestore'}, 'status': {'created': '2025-10-14T07:51:08.318859Z', 'last_refresh': '2025-10-14T09:49:25.491233Z', 'running': 6, 'size': 6}}) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": {"events": ["2025-10-14T07:51:08.326121Z service:osd.default_drive_group [INFO] \"service was created\""], "placement": {"hosts": ["np0005486731.localdomain", "np0005486732.localdomain", "np0005486733.localdomain"]}, "service_id": "default_drive_group", "service_name": "osd.default_drive_group", "service_type": "osd", "spec": {"data_devices": {"paths": ["/dev/ceph_vg0/ceph_lv0", "/dev/ceph_vg1/ceph_lv1"]}, "filter_logic": "AND", "objectstore": "bluestore"}, "status": {"created": "2025-10-14T07:51:08.318859Z", "last_refresh": "2025-10-14T09:49:25.491233Z", "running": 6, "size": 6}}} skipping: [np0005486728.localdomain] => {"msg": "All items skipped"} TASK [ceph_migrate : Dump ceph orch ls output to log file] ********************* skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "dump | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Get Ceph config] ****************************************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["cephadm", "shell", "--", "ceph", "config", "dump", "-f", "json"], "delta": "0:00:02.435049", "end": "2025-10-14 09:59:29.689797", "msg": "", "rc": 0, "start": "2025-10-14 09:59:27.254748", "stderr": "Inferring fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf\nInferring config /var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/mon.np0005486728/config\nUsing ceph image with id '2cc8f762c607' and tag 'latest' created on 2025-09-24 09:04:33 +0000 UTC\nregistry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8", "stderr_lines": ["Inferring fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "Inferring config /var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/mon.np0005486728/config", "Using ceph image with id '2cc8f762c607' and tag 'latest' created on 2025-09-24 09:04:33 +0000 UTC", "registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8"], "stdout": 
"\n[{\"section\":\"global\",\"name\":\"cluster_network\",\"value\":\"172.20.0.0/24\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"container_image\",\"value\":\"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\",\"level\":\"basic\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"ms_bind_ipv4\",\"value\":\"true\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"ms_bind_ipv6\",\"value\":\"false\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"public_network\",\"value\":\"172.18.0.0/24\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mon\",\"name\":\"auth_allow_insecure_global_id_reclaim\",\"value\":\"false\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/container_image_base\",\"value\":\"registry.redhat.io/rhceph/rhceph-7-rhel9\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/container_init\",\"value\":\"True\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/migration_current\",\"value\":\"7\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/use_repo_digest\",\"value\":\"false\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/orchestrator/orchestrator\",\"value\":\"cephadm\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"osd\",\"name\":\"osd_memory_target\",\"value\":\"5709084876\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"host:np0005486731\",\"location_type\":\"host\",\"location_value\":\"np0005486731\"},{\"section\":\"osd\",\"name\":\"osd_memory_target\",\"value\":\"5709084876\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"host:np0005486732\",\"location_type\":\"host\",\"location_value\":\"np0005486732\"},{\"section\":\"osd\",\"name\":\"osd_memory_target\",\"value\":\"5709084876\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"host:np0005486733\",\"location_type\":\"host\",\"location_value\":\"np0005486733\"},{\"section\":\"osd\",\"name\":\"osd_memory_target_autotune\",\"value\":\"true\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"mds.mds\",\"name\":\"mds_join_fs\",\"value\":\"mds\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"\"}]", "stdout_lines": ["", 
"[{\"section\":\"global\",\"name\":\"cluster_network\",\"value\":\"172.20.0.0/24\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"container_image\",\"value\":\"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\",\"level\":\"basic\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"ms_bind_ipv4\",\"value\":\"true\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"ms_bind_ipv6\",\"value\":\"false\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"public_network\",\"value\":\"172.18.0.0/24\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mon\",\"name\":\"auth_allow_insecure_global_id_reclaim\",\"value\":\"false\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/container_image_base\",\"value\":\"registry.redhat.io/rhceph/rhceph-7-rhel9\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/container_init\",\"value\":\"True\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/migration_current\",\"value\":\"7\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/use_repo_digest\",\"value\":\"false\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/orchestrator/orchestrator\",\"value\":\"cephadm\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"osd\",\"name\":\"osd_memory_target\",\"value\":\"5709084876\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"host:np0005486731\",\"location_type\":\"host\",\"location_value\":\"np0005486731\"},{\"section\":\"osd\",\"name\":\"osd_memory_target\",\"value\":\"5709084876\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"host:np0005486732\",\"location_type\":\"host\",\"location_value\":\"np0005486732\"},{\"section\":\"osd\",\"name\":\"osd_memory_target\",\"value\":\"5709084876\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"host:np0005486733\",\"location_type\":\"host\",\"location_value\":\"np0005486733\"},{\"section\":\"osd\",\"name\":\"osd_memory_target_autotune\",\"value\":\"true\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"mds.mds\",\"name\":\"mds_join_fs\",\"value\":\"mds\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"\"}]"]} TASK [ceph_migrate : Print Ceph config dump] *********************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Dump ceph config dump output to log file] ***************** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "dump | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Get Ceph Orch Host Map] *********************************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["cephadm", "shell", "--", "ceph", "orch", "host", "ls", "-f", "json"], "delta": "0:00:02.672628", "end": "2025-10-14 09:59:32.983222", "msg": "", "rc": 0, "start": "2025-10-14 09:59:30.310594", "stderr": "Inferring fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf\nInferring config 
/var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/mon.np0005486728/config\nUsing ceph image with id '2cc8f762c607' and tag 'latest' created on 2025-09-24 09:04:33 +0000 UTC\nregistry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8", "stderr_lines": ["Inferring fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "Inferring config /var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/mon.np0005486728/config", "Using ceph image with id '2cc8f762c607' and tag 'latest' created on 2025-09-24 09:04:33 +0000 UTC", "registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8"], "stdout": "\n[{\"addr\": \"192.168.122.103\", \"hostname\": \"np0005486728.localdomain\", \"labels\": [\"_admin\", \"mgr\", \"mon\"], \"status\": \"\"}, {\"addr\": \"192.168.122.104\", \"hostname\": \"np0005486729.localdomain\", \"labels\": [\"_admin\", \"mgr\", \"mon\"], \"status\": \"\"}, {\"addr\": \"192.168.122.105\", \"hostname\": \"np0005486730.localdomain\", \"labels\": [\"_admin\", \"mgr\", \"mon\"], \"status\": \"\"}, {\"addr\": \"192.168.122.106\", \"hostname\": \"np0005486731.localdomain\", \"labels\": [\"osd\"], \"status\": \"\"}, {\"addr\": \"192.168.122.107\", \"hostname\": \"np0005486732.localdomain\", \"labels\": [\"osd\"], \"status\": \"\"}, {\"addr\": \"192.168.122.108\", \"hostname\": \"np0005486733.localdomain\", \"labels\": [\"osd\"], \"status\": \"\"}]", "stdout_lines": ["", "[{\"addr\": \"192.168.122.103\", \"hostname\": \"np0005486728.localdomain\", \"labels\": [\"_admin\", \"mgr\", \"mon\"], \"status\": \"\"}, {\"addr\": \"192.168.122.104\", \"hostname\": \"np0005486729.localdomain\", \"labels\": [\"_admin\", \"mgr\", \"mon\"], \"status\": \"\"}, {\"addr\": \"192.168.122.105\", \"hostname\": \"np0005486730.localdomain\", \"labels\": [\"_admin\", \"mgr\", \"mon\"], \"status\": \"\"}, {\"addr\": \"192.168.122.106\", \"hostname\": \"np0005486731.localdomain\", \"labels\": [\"osd\"], \"status\": \"\"}, {\"addr\": \"192.168.122.107\", \"hostname\": \"np0005486732.localdomain\", \"labels\": [\"osd\"], \"status\": \"\"}, {\"addr\": \"192.168.122.108\", \"hostname\": \"np0005486733.localdomain\", \"labels\": [\"osd\"], \"status\": \"\"}]"]} TASK [ceph_migrate : Load nodes] *********************************************** ok: [np0005486728.localdomain] => {"ansible_facts": {"nds": [{"addr": "192.168.122.103", "hostname": "np0005486728.localdomain", "labels": ["_admin", "mgr", "mon"], "status": ""}, {"addr": "192.168.122.104", "hostname": "np0005486729.localdomain", "labels": ["_admin", "mgr", "mon"], "status": ""}, {"addr": "192.168.122.105", "hostname": "np0005486730.localdomain", "labels": ["_admin", "mgr", "mon"], "status": ""}, {"addr": "192.168.122.106", "hostname": "np0005486731.localdomain", "labels": ["osd"], "status": ""}, {"addr": "192.168.122.107", "hostname": "np0005486732.localdomain", "labels": ["osd"], "status": ""}, {"addr": "192.168.122.108", "hostname": "np0005486733.localdomain", "labels": ["osd"], "status": ""}]}, "changed": false} TASK [ceph_migrate : Load hostmap List] **************************************** ok: [np0005486728.localdomain] => {"ansible_facts": {"hostmap": {"np0005486728.localdomain": ["_admin", "mgr", "mon"], "np0005486729.localdomain": ["_admin", "mgr", "mon"], "np0005486730.localdomain": ["_admin", "mgr", "mon"], "np0005486731.localdomain": ["osd"], "np0005486732.localdomain": ["osd"], "np0005486733.localdomain": ["osd"]}}, "changed": false} TASK [ceph_migrate : Print Host 
Map] ******************************************* skipping: [np0005486728.localdomain] => (item=np0005486728.localdomain) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": "np0005486728.localdomain"} skipping: [np0005486728.localdomain] => (item=np0005486729.localdomain) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": "np0005486729.localdomain"} skipping: [np0005486728.localdomain] => (item=np0005486730.localdomain) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": "np0005486730.localdomain"} skipping: [np0005486728.localdomain] => (item=np0005486731.localdomain) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": "np0005486731.localdomain"} skipping: [np0005486728.localdomain] => (item=np0005486732.localdomain) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": "np0005486732.localdomain"} skipping: [np0005486728.localdomain] => (item=np0005486733.localdomain) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": "np0005486733.localdomain"} skipping: [np0005486728.localdomain] => {"msg": "All items skipped"} TASK [ceph_migrate : Dump ceph orch host ls output to log file] **************** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "dump | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Get Ceph monmap and load data] **************************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["cephadm", "shell", "--", "ceph", "mon", "dump", "-f", "json"], "delta": "0:00:02.882613", "end": "2025-10-14 09:59:36.739976", "msg": "", "rc": 0, "start": "2025-10-14 09:59:33.857363", "stderr": "Inferring fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf\nInferring config /var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/mon.np0005486728/config\nUsing ceph image with id '2cc8f762c607' and tag 'latest' created on 2025-09-24 09:04:33 +0000 UTC\nregistry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\ndumped monmap epoch 3", "stderr_lines": ["Inferring fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "Inferring config /var/lib/ceph/fcadf6e2-9176-5818-a8d0-37b19acf8eaf/mon.np0005486728/config", "Using ceph image with id '2cc8f762c607' and tag 'latest' created on 2025-09-24 09:04:33 +0000 UTC", "registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8", "dumped monmap epoch 3"], "stdout": "\n{\"epoch\":3,\"fsid\":\"fcadf6e2-9176-5818-a8d0-37b19acf8eaf\",\"modified\":\"2025-10-14T07:52:05.338454Z\",\"created\":\"2025-10-14T07:49:51.150761Z\",\"min_mon_release\":18,\"min_mon_release_name\":\"reef\",\"election_strategy\":1,\"disallowed_leaders: \":\"\",\"stretch_mode\":false,\"tiebreaker_mon\":\"\",\"removed_ranks: 
\":\"\",\"features\":{\"persistent\":[\"kraken\",\"luminous\",\"mimic\",\"osdmap-prune\",\"nautilus\",\"octopus\",\"pacific\",\"elector-pinging\",\"quincy\",\"reef\"],\"optional\":[]},\"mons\":[{\"rank\":0,\"name\":\"np0005486728\",\"public_addrs\":{\"addrvec\":[{\"type\":\"v2\",\"addr\":\"172.18.0.103:3300\",\"nonce\":0},{\"type\":\"v1\",\"addr\":\"172.18.0.103:6789\",\"nonce\":0}]},\"addr\":\"172.18.0.103:6789/0\",\"public_addr\":\"172.18.0.103:6789/0\",\"priority\":0,\"weight\":0,\"crush_location\":\"{}\"},{\"rank\":1,\"name\":\"np0005486730\",\"public_addrs\":{\"addrvec\":[{\"type\":\"v2\",\"addr\":\"172.18.0.105:3300\",\"nonce\":0},{\"type\":\"v1\",\"addr\":\"172.18.0.105:6789\",\"nonce\":0}]},\"addr\":\"172.18.0.105:6789/0\",\"public_addr\":\"172.18.0.105:6789/0\",\"priority\":0,\"weight\":0,\"crush_location\":\"{}\"},{\"rank\":2,\"name\":\"np0005486729\",\"public_addrs\":{\"addrvec\":[{\"type\":\"v2\",\"addr\":\"172.18.0.104:3300\",\"nonce\":0},{\"type\":\"v1\",\"addr\":\"172.18.0.104:6789\",\"nonce\":0}]},\"addr\":\"172.18.0.104:6789/0\",\"public_addr\":\"172.18.0.104:6789/0\",\"priority\":0,\"weight\":0,\"crush_location\":\"{}\"}],\"quorum\":[0,1,2]}", "stdout_lines": ["", "{\"epoch\":3,\"fsid\":\"fcadf6e2-9176-5818-a8d0-37b19acf8eaf\",\"modified\":\"2025-10-14T07:52:05.338454Z\",\"created\":\"2025-10-14T07:49:51.150761Z\",\"min_mon_release\":18,\"min_mon_release_name\":\"reef\",\"election_strategy\":1,\"disallowed_leaders: \":\"\",\"stretch_mode\":false,\"tiebreaker_mon\":\"\",\"removed_ranks: \":\"\",\"features\":{\"persistent\":[\"kraken\",\"luminous\",\"mimic\",\"osdmap-prune\",\"nautilus\",\"octopus\",\"pacific\",\"elector-pinging\",\"quincy\",\"reef\"],\"optional\":[]},\"mons\":[{\"rank\":0,\"name\":\"np0005486728\",\"public_addrs\":{\"addrvec\":[{\"type\":\"v2\",\"addr\":\"172.18.0.103:3300\",\"nonce\":0},{\"type\":\"v1\",\"addr\":\"172.18.0.103:6789\",\"nonce\":0}]},\"addr\":\"172.18.0.103:6789/0\",\"public_addr\":\"172.18.0.103:6789/0\",\"priority\":0,\"weight\":0,\"crush_location\":\"{}\"},{\"rank\":1,\"name\":\"np0005486730\",\"public_addrs\":{\"addrvec\":[{\"type\":\"v2\",\"addr\":\"172.18.0.105:3300\",\"nonce\":0},{\"type\":\"v1\",\"addr\":\"172.18.0.105:6789\",\"nonce\":0}]},\"addr\":\"172.18.0.105:6789/0\",\"public_addr\":\"172.18.0.105:6789/0\",\"priority\":0,\"weight\":0,\"crush_location\":\"{}\"},{\"rank\":2,\"name\":\"np0005486729\",\"public_addrs\":{\"addrvec\":[{\"type\":\"v2\",\"addr\":\"172.18.0.104:3300\",\"nonce\":0},{\"type\":\"v1\",\"addr\":\"172.18.0.104:6789\",\"nonce\":0}]},\"addr\":\"172.18.0.104:6789/0\",\"public_addr\":\"172.18.0.104:6789/0\",\"priority\":0,\"weight\":0,\"crush_location\":\"{}\"}],\"quorum\":[0,1,2]}"]} TASK [ceph_migrate : Get Monmap] *********************************************** ok: [np0005486728.localdomain] => {"ansible_facts": {"mon_dump": {"created": "2025-10-14T07:49:51.150761Z", "disallowed_leaders: ": "", "election_strategy": 1, "epoch": 3, "features": {"optional": [], "persistent": ["kraken", "luminous", "mimic", "osdmap-prune", "nautilus", "octopus", "pacific", "elector-pinging", "quincy", "reef"]}, "fsid": "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "min_mon_release": 18, "min_mon_release_name": "reef", "modified": "2025-10-14T07:52:05.338454Z", "mons": [{"addr": "172.18.0.103:6789/0", "crush_location": "{}", "name": "np0005486728", "priority": 0, "public_addr": "172.18.0.103:6789/0", "public_addrs": {"addrvec": [{"addr": "172.18.0.103:3300", "nonce": 0, "type": "v2"}, {"addr": "172.18.0.103:6789", "nonce": 0, "type": 
"v1"}]}, "rank": 0, "weight": 0}, {"addr": "172.18.0.105:6789/0", "crush_location": "{}", "name": "np0005486730", "priority": 0, "public_addr": "172.18.0.105:6789/0", "public_addrs": {"addrvec": [{"addr": "172.18.0.105:3300", "nonce": 0, "type": "v2"}, {"addr": "172.18.0.105:6789", "nonce": 0, "type": "v1"}]}, "rank": 1, "weight": 0}, {"addr": "172.18.0.104:6789/0", "crush_location": "{}", "name": "np0005486729", "priority": 0, "public_addr": "172.18.0.104:6789/0", "public_addrs": {"addrvec": [{"addr": "172.18.0.104:3300", "nonce": 0, "type": "v2"}, {"addr": "172.18.0.104:6789", "nonce": 0, "type": "v1"}]}, "rank": 2, "weight": 0}], "quorum": [0, 1, 2], "removed_ranks: ": "", "stretch_mode": false, "tiebreaker_mon": ""}}, "changed": false} TASK [ceph_migrate : Print monmap] ********************************************* skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Dump ceph mon dump output to log file] ******************** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "dump | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Load nodes to decommission] ******************************* ok: [np0005486728.localdomain] => {"ansible_facts": {"decomm_nodes": ["np0005486728.localdomain", "np0005486729.localdomain", "np0005486730.localdomain"]}, "changed": false} TASK [ceph_migrate : Load target nodes] **************************************** ok: [np0005486728.localdomain] => {"ansible_facts": {"target_nodes": ["np0005486731.localdomain", "np0005486732.localdomain", "np0005486733.localdomain"]}, "changed": false} TASK [ceph_migrate : Print target nodes] *************************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug|default(false)"} TASK [ceph_migrate : Print decomm_nodes] *************************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug|default(false)"} TASK [ceph_migrate : ansible.builtin.fail if input is not provided] ************ skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph is undefined or ceph | length == 0", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Get cluster health] *************************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : ansible.builtin.fail if health is HEALTH_WARN || HEALTH_ERR] *** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph.health.status == 'HEALTH_WARN' or ceph.health.status == 'HEALTH_ERR'", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : PgMap] **************************************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : ansible.builtin.fail if PGs are not in active+clean state] *** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "pgstate != 'active+clean'", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : OSDMap] *************************************************** ok: [np0005486728.localdomain] => { "msg": "100.0" } TASK [ceph_migrate : ansible.builtin.fail if there is an unacceptable OSDs number] *** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "pct | float < 100", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MonMap] *************************************************** skipping: 
[np0005486728.localdomain] => {"false_condition": "check_ceph_release | default(false) | bool"} TASK [ceph_migrate : ansible.builtin.fail if Ceph <= Quincy] ******************* skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "check_ceph_release | default(false) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Mons in quorum] ******************************************* skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : ansible.builtin.fail if Mons are not in quorum] *********** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph.monmap.num_mons < decomm_nodes | length", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : is Ceph Mgr available] ************************************ skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : ansible.builtin.fail if Mgr is not available] ************* skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "not ceph.mgrmap.available | bool | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : in progress events] *************************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : ansible.builtin.fail if there are in progress events] ***** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph.progress_events | length > 0", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Dump Ceph Status] ***************************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Get ceph_cli] ********************************************* included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_cli.yaml for np0005486728.localdomain TASK [ceph_migrate : Set ceph CLI] ********************************************* ok: [np0005486728.localdomain] => {"ansible_facts": {"ceph_cli": "podman run --rm --net=host --ipc=host --volume /etc/ceph:/etc/ceph:z --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring "}, "changed": false} TASK [ceph_migrate : set container image base in ceph configuration] *********** ok: [np0005486728.localdomain] => {"changed": false, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "config", "set", "mgr", "mgr/cephadm/container_image_base", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest"], "delta": "0:00:00.692475", "end": "2025-10-14 09:59:38.974935", "msg": "", "rc": 0, "start": "2025-10-14 09:59:38.282460", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : set alertmanager container image in ceph configuration] *** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : set grafana container image in ceph configuration] ******** skipping: [np0005486728.localdomain] => {"changed": false, 
"false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : set node-exporter container image in ceph configuration] *** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : set prometheus container image in ceph configuration] ***** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Set haproxy container image in ceph configuration] ******** ok: [np0005486728.localdomain] => {"changed": false, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "config", "set", "mgr", "mgr/cephadm/container_image_haproxy", "registry.redhat.io/rhceph/rhceph-haproxy-rhel9:latest"], "delta": "0:00:00.734646", "end": "2025-10-14 09:59:40.323442", "msg": "", "rc": 0, "start": "2025-10-14 09:59:39.588796", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : Set keepalived container image in ceph configuration] ***** ok: [np0005486728.localdomain] => {"changed": false, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "config", "set", "mgr", "mgr/cephadm/container_image_keepalived", "registry.redhat.io/rhceph/keepalived-rhel9:latest"], "delta": "0:00:00.775383", "end": "2025-10-14 09:59:41.704345", "msg": "", "rc": 0, "start": "2025-10-14 09:59:40.928962", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : Update firewall rules on the target nodes] **************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/firewall.yaml for np0005486728.localdomain => (item=np0005486731.localdomain) included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/firewall.yaml for np0005486728.localdomain => (item=np0005486732.localdomain) included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/firewall.yaml for np0005486728.localdomain => (item=np0005486733.localdomain) TASK [ceph_migrate : Ensure firewall is temporarily stopped] ******************* ok: [np0005486728.localdomain -> np0005486731.localdomain(192.168.122.106)] => (item=iptables) => {"ansible_loop_var": "item", "changed": false, "item": "iptables", "name": "iptables", "state": "stopped", "status": {"AccessSELinuxContext": "system_u:object_r:iptables_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket sysinit.target system.slice basic.target", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "tripleo-container-shutdown.service network.service shutdown.target edpm-container-shutdown.service network-pre.target", 
"BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "yes", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "shutdown.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "IPv4 firewall with iptables", "DevicePolicy": "auto", "DynamicUser": "no", "Environment": "BOOTUP=serial CONSOLETYPE=serial", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecReload": "{ path=/usr/libexec/iptables/iptables.init ; argv[]=/usr/libexec/iptables/iptables.init reload ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/libexec/iptables/iptables.init ; argv[]=/usr/libexec/iptables/iptables.init reload ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStart": "{ path=/usr/libexec/iptables/iptables.init ; argv[]=/usr/libexec/iptables/iptables.init start ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/libexec/iptables/iptables.init ; argv[]=/usr/libexec/iptables/iptables.init start ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/libexec/iptables/iptables.init ; argv[]=/usr/libexec/iptables/iptables.init stop ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/libexec/iptables/iptables.init ; argv[]=/usr/libexec/iptables/iptables.init stop ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/usr/lib/systemd/system/iptables.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "yes", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", 
"IPIngressPackets": "[no data]", "Id": "iptables.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "62638", "LimitNPROCSoft": "62638", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "62638", "LimitSIGPENDINGSoft": "62638", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "iptables.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "Requires": "system.slice sysinit.target", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Tue 2025-10-14 07:30:47 UTC", "StateChangeTimestampMonotonic": "3134705863", "StateDirectoryMode": "0755", 
"StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "100220", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "disabled", "UtmpMode": "init", "Wants": "network-pre.target", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity"}} changed: [np0005486728.localdomain -> np0005486731.localdomain(192.168.122.106)] => (item=nftables) => {"ansible_loop_var": "item", "changed": true, "item": "nftables", "name": "nftables", "state": "stopped", "status": {"AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestamp": "Tue 2025-10-14 08:01:35 UTC", "ActiveEnterTimestampMonotonic": "4982769500", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "system.slice systemd-journald.socket basic.target sysinit.target", "AllowIsolate": "no", "AssertResult": "yes", "AssertTimestamp": "Tue 2025-10-14 08:01:35 UTC", "AssertTimestampMonotonic": "4982667642", "Before": "multi-user.target network-pre.target shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "30621000", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "yes", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Tue 2025-10-14 08:01:35 UTC", "ConditionTimestampMonotonic": "4982667641", "ConfigurationDirectoryMode": "0755", "Conflicts": "shutdown.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Netfilter Tables", "DevicePolicy": "auto", "Documentation": "\"man:nft(8)\"", "DynamicUser": "no", "ExecMainCode": "1", "ExecMainExitTimestamp": "Tue 2025-10-14 08:01:35 UTC", "ExecMainExitTimestampMonotonic": "4982769158", "ExecMainPID": "41942", "ExecMainStartTimestamp": "Tue 2025-10-14 08:01:35 UTC", "ExecMainStartTimestampMonotonic": "4982683522", "ExecMainStatus": "0", "ExecReload": "{ path=/sbin/nft ; argv[]=/sbin/nft 
flush ruleset; include \"/etc/sysconfig/nftables.conf\"; ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/sbin/nft ; argv[]=/sbin/nft flush ruleset; include \"/etc/sysconfig/nftables.conf\"; ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStart": "{ path=/sbin/nft ; argv[]=/sbin/nft -f /etc/sysconfig/nftables.conf ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/sbin/nft ; argv[]=/sbin/nft -f /etc/sysconfig/nftables.conf ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/sbin/nft ; argv[]=/sbin/nft flush ruleset ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/sbin/nft ; argv[]=/sbin/nft flush ruleset ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/usr/lib/systemd/system/nftables.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "yes", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "nftables.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Tue 2025-10-14 08:01:35 UTC", "InactiveExitTimestampMonotonic": "4982683797", "InvocationID": "664214e211ec4588b0252b6e6b1be7af", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "62638", "LimitNPROCSoft": "62638", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "62638", "LimitSIGPENDINGSoft": "62638", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", 
"NRestarts": "0", "NUMAPolicy": "n/a", "Names": "nftables.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "yes", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "full", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "Requires": "system.slice sysinit.target", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Tue 2025-10-14 08:01:35 UTC", "StateChangeTimestampMonotonic": "4982769500", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "exited", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "100220", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "Wants": "network-pre.target", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0"}} TASK [ceph_migrate : Ceph Migration - Apply the Ceph cluster rules (iptables)] *** skipping: [np0005486728.localdomain] => (item=/etc/sysconfig/iptables) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_firewall_type == \"iptables\"", "item": "/etc/sysconfig/iptables", "skip_reason": "Conditional result was False"} skipping: [np0005486728.localdomain] => (item=/etc/sysconfig/ip6tables) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_firewall_type == \"iptables\"", "item": "/etc/sysconfig/ip6tables", "skip_reason": "Conditional result was False"} skipping: [np0005486728.localdomain] => {"changed": false, "msg": "All items skipped"} TASK [ceph_migrate : Ensure firewall is enabled/started - iptables] ************ skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph_firewall_type == \"iptables\"", "skip_reason": "Conditional 
result was False"} TASK [ceph_migrate : Ceph Migration - Apply the Ceph cluster rules (nftables)] *** changed: [np0005486728.localdomain -> np0005486731.localdomain(192.168.122.106)] => {"changed": true, "msg": "Block inserted"} TASK [ceph_migrate : Ensure firewall is enabled/started - nftables] ************ skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph_firewall_enabled | bool | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Ensure firewall is temporarily stopped] ******************* ok: [np0005486728.localdomain -> np0005486732.localdomain(192.168.122.107)] => (item=iptables) => {"ansible_loop_var": "item", "changed": false, "item": "iptables", "name": "iptables", "state": "stopped", "status": {"AccessSELinuxContext": "system_u:object_r:iptables_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system.slice basic.target sysinit.target systemd-journald.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "edpm-container-shutdown.service network.service shutdown.target tripleo-container-shutdown.service network-pre.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "yes", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "shutdown.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "IPv4 firewall with iptables", "DevicePolicy": "auto", "DynamicUser": "no", "Environment": "BOOTUP=serial CONSOLETYPE=serial", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecReload": "{ path=/usr/libexec/iptables/iptables.init ; argv[]=/usr/libexec/iptables/iptables.init reload ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/libexec/iptables/iptables.init ; argv[]=/usr/libexec/iptables/iptables.init reload ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStart": "{ path=/usr/libexec/iptables/iptables.init ; argv[]=/usr/libexec/iptables/iptables.init start ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; 
status=0/0 }", "ExecStartEx": "{ path=/usr/libexec/iptables/iptables.init ; argv[]=/usr/libexec/iptables/iptables.init start ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/libexec/iptables/iptables.init ; argv[]=/usr/libexec/iptables/iptables.init stop ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/libexec/iptables/iptables.init ; argv[]=/usr/libexec/iptables/iptables.init stop ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/usr/lib/systemd/system/iptables.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "yes", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "iptables.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "62638", "LimitNPROCSoft": "62638", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "62638", "LimitSIGPENDINGSoft": "62638", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "iptables.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", 
"ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "Requires": "sysinit.target system.slice", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Tue 2025-10-14 07:30:54 UTC", "StateChangeTimestampMonotonic": "3139206718", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "100220", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "disabled", "UtmpMode": "init", "Wants": "network-pre.target", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity"}} changed: [np0005486728.localdomain -> np0005486732.localdomain(192.168.122.107)] => (item=nftables) => {"ansible_loop_var": "item", "changed": true, "item": "nftables", "name": "nftables", "state": "stopped", "status": {"AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestamp": "Tue 2025-10-14 08:01:34 UTC", "ActiveEnterTimestampMonotonic": "4979770279", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "systemd-journald.socket basic.target system.slice sysinit.target", "AllowIsolate": "no", "AssertResult": "yes", "AssertTimestamp": "Tue 2025-10-14 08:01:34 UTC", "AssertTimestampMonotonic": "4979687831", "Before": "multi-user.target network-pre.target shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "20038000", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "yes", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock 
cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Tue 2025-10-14 08:01:34 UTC", "ConditionTimestampMonotonic": "4979687829", "ConfigurationDirectoryMode": "0755", "Conflicts": "shutdown.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Netfilter Tables", "DevicePolicy": "auto", "Documentation": "\"man:nft(8)\"", "DynamicUser": "no", "ExecMainCode": "1", "ExecMainExitTimestamp": "Tue 2025-10-14 08:01:34 UTC", "ExecMainExitTimestampMonotonic": "4979769970", "ExecMainPID": "42838", "ExecMainStartTimestamp": "Tue 2025-10-14 08:01:34 UTC", "ExecMainStartTimestampMonotonic": "4979698146", "ExecMainStatus": "0", "ExecReload": "{ path=/sbin/nft ; argv[]=/sbin/nft flush ruleset; include \"/etc/sysconfig/nftables.conf\"; ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/sbin/nft ; argv[]=/sbin/nft flush ruleset; include \"/etc/sysconfig/nftables.conf\"; ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStart": "{ path=/sbin/nft ; argv[]=/sbin/nft -f /etc/sysconfig/nftables.conf ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/sbin/nft ; argv[]=/sbin/nft -f /etc/sysconfig/nftables.conf ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/sbin/nft ; argv[]=/sbin/nft flush ruleset ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/sbin/nft ; argv[]=/sbin/nft flush ruleset ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/usr/lib/systemd/system/nftables.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "yes", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "nftables.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Tue 2025-10-14 08:01:34 UTC", "InactiveExitTimestampMonotonic": "4979698510", "InvocationID": "1dfc058659d64052b924d1f1e559f2f6", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", 
"LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "62638", "LimitNPROCSoft": "62638", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "62638", "LimitSIGPENDINGSoft": "62638", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "nftables.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "yes", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "full", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "Requires": "sysinit.target system.slice", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Tue 2025-10-14 08:01:34 UTC", "StateChangeTimestampMonotonic": "4979770279", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "exited", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "100220", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", 
"TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "Wants": "network-pre.target", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0"}} TASK [ceph_migrate : Ceph Migration - Apply the Ceph cluster rules (iptables)] *** skipping: [np0005486728.localdomain] => (item=/etc/sysconfig/iptables) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_firewall_type == \"iptables\"", "item": "/etc/sysconfig/iptables", "skip_reason": "Conditional result was False"} skipping: [np0005486728.localdomain] => (item=/etc/sysconfig/ip6tables) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_firewall_type == \"iptables\"", "item": "/etc/sysconfig/ip6tables", "skip_reason": "Conditional result was False"} skipping: [np0005486728.localdomain] => {"changed": false, "msg": "All items skipped"} TASK [ceph_migrate : Ensure firewall is enabled/started - iptables] ************ skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph_firewall_type == \"iptables\"", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Ceph Migration - Apply the Ceph cluster rules (nftables)] *** changed: [np0005486728.localdomain -> np0005486732.localdomain(192.168.122.107)] => {"changed": true, "msg": "Block inserted"} TASK [ceph_migrate : Ensure firewall is enabled/started - nftables] ************ skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph_firewall_enabled | bool | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Ensure firewall is temporarily stopped] ******************* ok: [np0005486728.localdomain -> np0005486733.localdomain(192.168.122.108)] => (item=iptables) => {"ansible_loop_var": "item", "changed": false, "item": "iptables", "name": "iptables", "state": "stopped", "status": {"AccessSELinuxContext": "system_u:object_r:iptables_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket sysinit.target system.slice basic.target", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "tripleo-container-shutdown.service shutdown.target network-pre.target edpm-container-shutdown.service network.service", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "yes", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf 
cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "shutdown.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "IPv4 firewall with iptables", "DevicePolicy": "auto", "DynamicUser": "no", "Environment": "BOOTUP=serial CONSOLETYPE=serial", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecReload": "{ path=/usr/libexec/iptables/iptables.init ; argv[]=/usr/libexec/iptables/iptables.init reload ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/libexec/iptables/iptables.init ; argv[]=/usr/libexec/iptables/iptables.init reload ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStart": "{ path=/usr/libexec/iptables/iptables.init ; argv[]=/usr/libexec/iptables/iptables.init start ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/libexec/iptables/iptables.init ; argv[]=/usr/libexec/iptables/iptables.init start ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/libexec/iptables/iptables.init ; argv[]=/usr/libexec/iptables/iptables.init stop ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/libexec/iptables/iptables.init ; argv[]=/usr/libexec/iptables/iptables.init stop ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/usr/lib/systemd/system/iptables.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "yes", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "iptables.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "62638", "LimitNPROCSoft": "62638", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", 
"LimitSIGPENDING": "62638", "LimitSIGPENDINGSoft": "62638", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "iptables.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "Requires": "system.slice sysinit.target", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Tue 2025-10-14 07:31:00 UTC", "StateChangeTimestampMonotonic": "3076753900", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "100220", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "disabled", "UtmpMode": "init", "Wants": "network-pre.target", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity"}} changed: [np0005486728.localdomain -> np0005486733.localdomain(192.168.122.108)] => (item=nftables) => {"ansible_loop_var": "item", "changed": true, "item": "nftables", "name": "nftables", "state": "stopped", 
"status": {"AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestamp": "Tue 2025-10-14 08:01:36 UTC", "ActiveEnterTimestampMonotonic": "4912232883", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "sysinit.target basic.target system.slice systemd-journald.socket", "AllowIsolate": "no", "AssertResult": "yes", "AssertTimestamp": "Tue 2025-10-14 08:01:35 UTC", "AssertTimestampMonotonic": "4912142977", "Before": "network-pre.target shutdown.target multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "34257000", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "yes", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Tue 2025-10-14 08:01:35 UTC", "ConditionTimestampMonotonic": "4912142976", "ConfigurationDirectoryMode": "0755", "Conflicts": "shutdown.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Netfilter Tables", "DevicePolicy": "auto", "Documentation": "\"man:nft(8)\"", "DynamicUser": "no", "ExecMainCode": "1", "ExecMainExitTimestamp": "Tue 2025-10-14 08:01:36 UTC", "ExecMainExitTimestampMonotonic": "4912232693", "ExecMainPID": "42066", "ExecMainStartTimestamp": "Tue 2025-10-14 08:01:35 UTC", "ExecMainStartTimestampMonotonic": "4912144678", "ExecMainStatus": "0", "ExecReload": "{ path=/sbin/nft ; argv[]=/sbin/nft flush ruleset; include \"/etc/sysconfig/nftables.conf\"; ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/sbin/nft ; argv[]=/sbin/nft flush ruleset; include \"/etc/sysconfig/nftables.conf\"; ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStart": "{ path=/sbin/nft ; argv[]=/sbin/nft -f /etc/sysconfig/nftables.conf ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/sbin/nft ; argv[]=/sbin/nft -f /etc/sysconfig/nftables.conf ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/sbin/nft ; argv[]=/sbin/nft flush ruleset ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/sbin/nft ; argv[]=/sbin/nft flush ruleset ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", 
"FinalKillSignal": "9", "FragmentPath": "/usr/lib/systemd/system/nftables.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "yes", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "nftables.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Tue 2025-10-14 08:01:35 UTC", "InactiveExitTimestampMonotonic": "4912144878", "InvocationID": "31dba8aa449c4a29b3950580e4f6f816", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "62638", "LimitNPROCSoft": "62638", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "62638", "LimitSIGPENDINGSoft": "62638", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "nftables.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "yes", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "full", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "RemainAfterExit": "yes", "RemoveIPC": "no", "Requires": "system.slice sysinit.target", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", 
"RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Tue 2025-10-14 08:01:36 UTC", "StateChangeTimestampMonotonic": "4912232883", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "exited", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "100220", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "Wants": "network-pre.target", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0"}} TASK [ceph_migrate : Ceph Migration - Apply the Ceph cluster rules (iptables)] *** skipping: [np0005486728.localdomain] => (item=/etc/sysconfig/iptables) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_firewall_type == \"iptables\"", "item": "/etc/sysconfig/iptables", "skip_reason": "Conditional result was False"} skipping: [np0005486728.localdomain] => (item=/etc/sysconfig/ip6tables) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_firewall_type == \"iptables\"", "item": "/etc/sysconfig/ip6tables", "skip_reason": "Conditional result was False"} skipping: [np0005486728.localdomain] => {"changed": false, "msg": "All items skipped"} TASK [ceph_migrate : Ensure firewall is enabled/started - iptables] ************ skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph_firewall_type == \"iptables\"", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Ceph Migration - Apply the Ceph cluster rules (nftables)] *** changed: [np0005486728.localdomain -> np0005486733.localdomain(192.168.122.108)] => {"changed": true, "msg": "Block inserted"} TASK [ceph_migrate : Ensure firewall is enabled/started - nftables] ************ skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph_firewall_enabled | bool | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Get ceph_cli] ********************************************* skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Set the dashboard port] *********************************** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Set the dashboard ssl port] 
******************************* skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Disable mgr dashboard module (restart)] ******************* skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Enable mgr dashboard module (restart)] ******************** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Set/Unset labels - add] *********************************** skipping: [np0005486728.localdomain] => {"false_condition": "ceph_daemons_layout.monitoring | default(true) | bool"} TASK [ceph_migrate : Print nodes] ********************************************** skipping: [np0005486728.localdomain] => {"false_condition": "ceph_daemons_layout.monitoring | default(true) | bool"} TASK [ceph_migrate : Set/Unset labels - add] *********************************** skipping: [np0005486728.localdomain] => (item=['np0005486731.localdomain', 'monitoring']) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "item": ["np0005486731.localdomain", "monitoring"], "skip_reason": "Conditional result was False"} skipping: [np0005486728.localdomain] => (item=['np0005486732.localdomain', 'monitoring']) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "item": ["np0005486732.localdomain", "monitoring"], "skip_reason": "Conditional result was False"} skipping: [np0005486728.localdomain] => (item=['np0005486733.localdomain', 'monitoring']) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "item": ["np0005486733.localdomain", "monitoring"], "skip_reason": "Conditional result was False"} skipping: [np0005486728.localdomain] => {"changed": false, "msg": "All items skipped"} TASK [ceph_migrate : MONITORING - Load Spec from the orchestrator] ************* skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Print the loaded data] ************************************ skipping: [np0005486728.localdomain] => {"false_condition": "ceph_daemons_layout.monitoring | default(true) | bool"} TASK [ceph_migrate : Update the Monitoring Stack spec definition] ************** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Print the resulting spec] ********************************* skipping: [np0005486728.localdomain] => {"false_condition": "ceph_daemons_layout.monitoring | default(true) | bool"} TASK [ceph_migrate : MONITORING - wait daemons] ******************************** skipping: [np0005486728.localdomain] => (item=grafana) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "item": "grafana", "skip_reason": "Conditional result was False"} skipping: [np0005486728.localdomain] => 
(item=prometheus) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "item": "prometheus", "skip_reason": "Conditional result was False"} skipping: [np0005486728.localdomain] => (item=alertmanager) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "item": "alertmanager", "skip_reason": "Conditional result was False"} skipping: [np0005486728.localdomain] => {"changed": false, "msg": "All items skipped"} TASK [ceph_migrate : Sleep before moving to the next daemon] ******************* skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.monitoring | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Get ceph_cli] ********************************************* included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_cli.yaml for np0005486728.localdomain TASK [ceph_migrate : Set ceph CLI] ********************************************* ok: [np0005486728.localdomain] => {"ansible_facts": {"ceph_cli": "podman run --rm --net=host --ipc=host --volume /etc/ceph:/etc/ceph:z --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring "}, "changed": false} TASK [ceph_migrate : MDS - Load Spec from the orchestrator] ******************** ok: [np0005486728.localdomain] => {"ansible_facts": {"mds_spec": {"service_name": "mds.mds", "service_type": "mds", "spec": {}}}, "changed": false} TASK [ceph_migrate : Print the resulting MDS spec] ***************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Set/Unset labels - add] *********************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Print nodes] ********************************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Set/Unset labels - add] *********************************** changed: [np0005486728.localdomain] => (item=['np0005486728.localdomain', 'mds']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005486728.localdomain", "mds"], "delta": "0:00:00.742976", "end": "2025-10-14 09:59:54.347919", "item": ["np0005486728.localdomain", "mds"], "msg": "", "rc": 0, "start": "2025-10-14 09:59:53.604943", "stderr": "", "stderr_lines": [], "stdout": "Added label mds to host np0005486728.localdomain", "stdout_lines": ["Added label mds to host np0005486728.localdomain"]} changed: [np0005486728.localdomain] => (item=['np0005486729.localdomain', 'mds']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", 
"label", "add", "np0005486729.localdomain", "mds"], "delta": "0:00:00.678503", "end": "2025-10-14 09:59:55.558548", "item": ["np0005486729.localdomain", "mds"], "msg": "", "rc": 0, "start": "2025-10-14 09:59:54.880045", "stderr": "", "stderr_lines": [], "stdout": "Added label mds to host np0005486729.localdomain", "stdout_lines": ["Added label mds to host np0005486729.localdomain"]} changed: [np0005486728.localdomain] => (item=['np0005486730.localdomain', 'mds']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005486730.localdomain", "mds"], "delta": "0:00:00.769837", "end": "2025-10-14 09:59:56.927867", "item": ["np0005486730.localdomain", "mds"], "msg": "", "rc": 0, "start": "2025-10-14 09:59:56.158030", "stderr": "", "stderr_lines": [], "stdout": "Added label mds to host np0005486730.localdomain", "stdout_lines": ["Added label mds to host np0005486730.localdomain"]} changed: [np0005486728.localdomain] => (item=['np0005486731.localdomain', 'mds']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005486731.localdomain", "mds"], "delta": "0:00:00.681988", "end": "2025-10-14 09:59:58.115770", "item": ["np0005486731.localdomain", "mds"], "msg": "", "rc": 0, "start": "2025-10-14 09:59:57.433782", "stderr": "", "stderr_lines": [], "stdout": "Added label mds to host np0005486731.localdomain", "stdout_lines": ["Added label mds to host np0005486731.localdomain"]} changed: [np0005486728.localdomain] => (item=['np0005486732.localdomain', 'mds']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005486732.localdomain", "mds"], "delta": "0:00:00.702406", "end": "2025-10-14 09:59:59.353857", "item": ["np0005486732.localdomain", "mds"], "msg": "", "rc": 0, "start": "2025-10-14 09:59:58.651451", "stderr": "", "stderr_lines": [], "stdout": "Added label mds to host np0005486732.localdomain", "stdout_lines": ["Added label mds to host np0005486732.localdomain"]} changed: [np0005486728.localdomain] => (item=['np0005486733.localdomain', 'mds']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005486733.localdomain", "mds"], "delta": "0:00:00.778577", "end": "2025-10-14 10:00:00.658901", "item": ["np0005486733.localdomain", "mds"], "msg": "", "rc": 0, "start": "2025-10-14 09:59:59.880324", "stderr": "", 
"stderr_lines": [], "stdout": "Added label mds to host np0005486733.localdomain", "stdout_lines": ["Added label mds to host np0005486733.localdomain"]} TASK [ceph_migrate : Update the MDS Daemon spec definition] ******************** ok: [np0005486728.localdomain] => {"changed": false, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "-v", "/etc/ceph:/etc/ceph:z", "-v", "/var/lib/ceph/:/var/lib/ceph/:z", "-v", "/var/log/ceph/:/var/log/ceph/:z", "-v", "/home/tripleo-admin/mds:/home/tripleo-admin/mds:z", "--entrypoint=ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "-n", "client.admin", "-k", "/etc/ceph/ceph.client.admin.keyring", "--cluster", "ceph", "orch", "apply", "--in-file", "/home/tripleo-admin/mds"], "delta": "0:00:00.664028", "end": "2025-10-14 10:00:02.062709", "rc": 0, "start": "2025-10-14 10:00:01.398681", "stderr": "", "stderr_lines": [], "stdout": "Scheduled mds.mds update...", "stdout_lines": ["Scheduled mds.mds update..."]} TASK [ceph_migrate : Print the resulting spec] ********************************* skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Wait for the orchestrator to process the spec] ************ Pausing for 30 seconds ok: [np0005486728.localdomain] => {"changed": false, "delta": 30, "echo": true, "rc": 0, "start": "2025-10-14 10:00:02.266875", "stderr": "", "stdout": "Paused for 30.03 seconds", "stop": "2025-10-14 10:00:32.297299", "user_input": ""} TASK [ceph_migrate : Reload the updated mdsmap] ******************************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "fs", "status", "cephfs", "-f", "json"], "delta": "0:00:00.679448", "end": "2025-10-14 10:00:33.432427", "msg": "", "rc": 0, "start": "2025-10-14 10:00:32.752979", "stderr": "", "stderr_lines": [], "stdout": "\n{\"clients\": [{\"clients\": 0, \"fs\": \"cephfs\"}], \"mds_version\": [{\"daemon\": [\"mds.np0005486730.hzolgi\", \"mds.np0005486728.mdbtxc\", \"mds.np0005486733.tvstmf\", \"mds.np0005486731.onyaog\", \"mds.np0005486729.iznaug\", \"mds.np0005486732.xkownj\"], \"version\": \"ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable)\"}], \"mdsmap\": [{\"caps\": 0, \"dirs\": 12, \"dns\": 10, \"inos\": 13, \"name\": \"mds.np0005486730.hzolgi\", \"rank\": 0, \"rate\": 0, \"state\": \"active\"}, {\"name\": \"mds.np0005486728.mdbtxc\", \"state\": \"standby\"}, {\"name\": \"mds.np0005486733.tvstmf\", \"state\": \"standby\"}, {\"name\": \"mds.np0005486731.onyaog\", \"state\": \"standby\"}, {\"name\": \"mds.np0005486729.iznaug\", \"state\": \"standby\"}, {\"name\": \"mds.np0005486732.xkownj\", \"state\": \"standby\"}], \"pools\": [{\"avail\": 14037185536, \"id\": 7, \"name\": \"manila_metadata\", \"type\": \"metadata\", \"used\": 98304}, {\"avail\": 14037185536, \"id\": 6, \"name\": \"manila_data\", \"type\": \"data\", \"used\": 0}]}", "stdout_lines": ["", "{\"clients\": [{\"clients\": 0, \"fs\": \"cephfs\"}], \"mds_version\": [{\"daemon\": [\"mds.np0005486730.hzolgi\", \"mds.np0005486728.mdbtxc\", \"mds.np0005486733.tvstmf\", \"mds.np0005486731.onyaog\", \"mds.np0005486729.iznaug\", \"mds.np0005486732.xkownj\"], \"version\": \"ceph version 18.2.1-361.el9cp 
(439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable)\"}], \"mdsmap\": [{\"caps\": 0, \"dirs\": 12, \"dns\": 10, \"inos\": 13, \"name\": \"mds.np0005486730.hzolgi\", \"rank\": 0, \"rate\": 0, \"state\": \"active\"}, {\"name\": \"mds.np0005486728.mdbtxc\", \"state\": \"standby\"}, {\"name\": \"mds.np0005486733.tvstmf\", \"state\": \"standby\"}, {\"name\": \"mds.np0005486731.onyaog\", \"state\": \"standby\"}, {\"name\": \"mds.np0005486729.iznaug\", \"state\": \"standby\"}, {\"name\": \"mds.np0005486732.xkownj\", \"state\": \"standby\"}], \"pools\": [{\"avail\": 14037185536, \"id\": 7, \"name\": \"manila_metadata\", \"type\": \"metadata\", \"used\": 98304}, {\"avail\": 14037185536, \"id\": 6, \"name\": \"manila_data\", \"type\": \"data\", \"used\": 0}]}"]} TASK [ceph_migrate : Get MDS Daemons] ****************************************** ok: [np0005486728.localdomain] => {"ansible_facts": {"mds_daemons": {"clients": [{"clients": 0, "fs": "cephfs"}], "mds_version": [{"daemon": ["mds.np0005486730.hzolgi", "mds.np0005486728.mdbtxc", "mds.np0005486733.tvstmf", "mds.np0005486731.onyaog", "mds.np0005486729.iznaug", "mds.np0005486732.xkownj"], "version": "ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable)"}], "mdsmap": [{"caps": 0, "dirs": 12, "dns": 10, "inos": 13, "name": "mds.np0005486730.hzolgi", "rank": 0, "rate": 0, "state": "active"}, {"name": "mds.np0005486728.mdbtxc", "state": "standby"}, {"name": "mds.np0005486733.tvstmf", "state": "standby"}, {"name": "mds.np0005486731.onyaog", "state": "standby"}, {"name": "mds.np0005486729.iznaug", "state": "standby"}, {"name": "mds.np0005486732.xkownj", "state": "standby"}], "pools": [{"avail": 14037185536, "id": 7, "name": "manila_metadata", "type": "metadata", "used": 98304}, {"avail": 14037185536, "id": 6, "name": "manila_data", "type": "data", "used": 0}]}}, "changed": false} TASK [ceph_migrate : Print Daemons] ******************************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Get MDS daemons that are not part of decomm nodes] ******** skipping: [np0005486728.localdomain] => (item={'caps': 0, 'dirs': 12, 'dns': 10, 'inos': 13, 'name': 'mds.np0005486730.hzolgi', 'rank': 0, 'rate': 0, 'state': 'active'}) => {"ansible_loop_var": "item", "changed": false, "false_condition": "item.state == \"standby\"", "item": {"caps": 0, "dirs": 12, "dns": 10, "inos": 13, "name": "mds.np0005486730.hzolgi", "rank": 0, "rate": 0, "state": "active"}, "skip_reason": "Conditional result was False"} ok: [np0005486728.localdomain] => (item={'name': 'mds.np0005486728.mdbtxc', 'state': 'standby'}) => {"ansible_facts": {"mds_aff_daemon": {"name": "mds.np0005486728.mdbtxc", "state": "standby"}}, "ansible_loop_var": "item", "changed": false, "item": {"name": "mds.np0005486728.mdbtxc", "state": "standby"}} ok: [np0005486728.localdomain] => (item={'name': 'mds.np0005486733.tvstmf', 'state': 'standby'}) => {"ansible_facts": {"mds_aff_daemon": {"name": "mds.np0005486733.tvstmf", "state": "standby"}}, "ansible_loop_var": "item", "changed": false, "item": {"name": "mds.np0005486733.tvstmf", "state": "standby"}} ok: [np0005486728.localdomain] => (item={'name': 'mds.np0005486731.onyaog', 'state': 'standby'}) => {"ansible_facts": {"mds_aff_daemon": {"name": "mds.np0005486731.onyaog", "state": "standby"}}, "ansible_loop_var": "item", "changed": false, "item": {"name": "mds.np0005486731.onyaog", "state": "standby"}} ok: [np0005486728.localdomain] => (item={'name': 
'mds.np0005486729.iznaug', 'state': 'standby'}) => {"ansible_facts": {"mds_aff_daemon": {"name": "mds.np0005486729.iznaug", "state": "standby"}}, "ansible_loop_var": "item", "changed": false, "item": {"name": "mds.np0005486729.iznaug", "state": "standby"}} ok: [np0005486728.localdomain] => (item={'name': 'mds.np0005486732.xkownj', 'state': 'standby'}) => {"ansible_facts": {"mds_aff_daemon": {"name": "mds.np0005486732.xkownj", "state": "standby"}}, "ansible_loop_var": "item", "changed": false, "item": {"name": "mds.np0005486732.xkownj", "state": "standby"}} TASK [ceph_migrate : Affinity daemon selected] ********************************* ok: [np0005486728.localdomain] => { "msg": { "name": "mds.np0005486732.xkownj", "state": "standby" } } TASK [ceph_migrate : Set MDS affinity] ***************************************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": "podman run --rm --net=host --ipc=host --volume /etc/ceph:/etc/ceph:z --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring config set mds.np0005486732.xkownj mds_join_fs cephfs", "delta": "0:00:00.649608", "end": "2025-10-14 10:00:34.886211", "msg": "", "rc": 0, "start": "2025-10-14 10:00:34.236603", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : Set/Unset labels - rm] ************************************ skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Print nodes] ********************************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Set/Unset labels - rm] ************************************ changed: [np0005486728.localdomain] => (item=['np0005486728.localdomain', 'mds']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "rm", "np0005486728.localdomain", "mds"], "delta": "0:00:00.705411", "end": "2025-10-14 10:00:36.271551", "item": ["np0005486728.localdomain", "mds"], "msg": "", "rc": 0, "start": "2025-10-14 10:00:35.566140", "stderr": "", "stderr_lines": [], "stdout": "Removed label mds from host np0005486728.localdomain", "stdout_lines": ["Removed label mds from host np0005486728.localdomain"]} changed: [np0005486728.localdomain] => (item=['np0005486729.localdomain', 'mds']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "rm", "np0005486729.localdomain", "mds"], "delta": "0:00:00.824133", "end": "2025-10-14 10:00:37.620739", "item": ["np0005486729.localdomain", "mds"], "msg": "", "rc": 0, "start": "2025-10-14 10:00:36.796606", "stderr": "", "stderr_lines": [], "stdout": "Removed label mds from host np0005486729.localdomain", "stdout_lines": ["Removed label mds from host np0005486729.localdomain"]} changed: [np0005486728.localdomain] => (item=['np0005486730.localdomain', 'mds']) => 
{"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "rm", "np0005486730.localdomain", "mds"], "delta": "0:00:00.677363", "end": "2025-10-14 10:00:38.884931", "item": ["np0005486730.localdomain", "mds"], "msg": "", "rc": 0, "start": "2025-10-14 10:00:38.207568", "stderr": "", "stderr_lines": [], "stdout": "Removed label mds from host np0005486730.localdomain", "stdout_lines": ["Removed label mds from host np0005486730.localdomain"]} TASK [ceph_migrate : Wait daemons] ********************************************* included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/wait_daemons.yaml for np0005486728.localdomain TASK [ceph_migrate : print daemon id option] *********************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : wait for mds] ********************************************* changed: [np0005486728.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "ps", "--daemon_type", "mds", "-f", "json"], "delta": "0:00:00.647559", "end": "2025-10-14 10:00:40.238730", "msg": "", "rc": 0, "start": "2025-10-14 10:00:39.591171", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"container_id\": \"2e71b617cf95\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"0.04%\", \"created\": \"2025-10-14T08:11:53.152225Z\", \"daemon_id\": \"mds.np0005486728.mdbtxc\", \"daemon_name\": \"mds.mds.np0005486728.mdbtxc\", \"daemon_type\": \"mds\", \"events\": [\"2025-10-14T08:11:53.239247Z daemon:mds.mds.np0005486728.mdbtxc [INFO] \\\"Deployed mds.mds.np0005486728.mdbtxc on host 'np0005486728.localdomain'\\\"\"], \"hostname\": \"np0005486728.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T09:55:46.204791Z\", \"memory_usage\": 28426895, \"ports\": [], \"service_name\": \"mds.mds\", \"started\": \"2025-10-14T08:11:53.041306Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}, {\"container_id\": \"bd243c8a61ed\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"0.04%\", \"created\": \"2025-10-14T08:11:55.462686Z\", 
\"daemon_id\": \"mds.np0005486729.iznaug\", \"daemon_name\": \"mds.mds.np0005486729.iznaug\", \"daemon_type\": \"mds\", \"events\": [\"2025-10-14T08:11:55.546678Z daemon:mds.mds.np0005486729.iznaug [INFO] \\\"Deployed mds.mds.np0005486729.iznaug on host 'np0005486729.localdomain'\\\"\"], \"hostname\": \"np0005486729.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T09:55:46.226087Z\", \"memory_usage\": 28080865, \"ports\": [], \"service_name\": \"mds.mds\", \"started\": \"2025-10-14T08:11:55.361547Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}, {\"container_id\": \"ae34ec31a4b0\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"0.16%\", \"created\": \"2025-10-14T08:11:51.036241Z\", \"daemon_id\": \"mds.np0005486730.hzolgi\", \"daemon_name\": \"mds.mds.np0005486730.hzolgi\", \"daemon_type\": \"mds\", \"events\": [\"2025-10-14T08:11:51.108594Z daemon:mds.mds.np0005486730.hzolgi [INFO] \\\"Deployed mds.mds.np0005486730.hzolgi on host 'np0005486730.localdomain'\\\"\"], \"hostname\": \"np0005486730.localdomain\", \"is_active\": true, \"last_refresh\": \"2025-10-14T09:55:45.631759Z\", \"memory_usage\": 27273461, \"ports\": [], \"service_name\": \"mds.mds\", \"started\": \"2025-10-14T08:11:50.950576Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}, {\"container_id\": \"5454859a5ca1\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"6.10%\", \"created\": \"2025-10-14T10:00:09.084948Z\", \"daemon_id\": \"mds.np0005486731.onyaog\", \"daemon_name\": \"mds.mds.np0005486731.onyaog\", \"daemon_type\": \"mds\", \"events\": [\"2025-10-14T10:00:09.155739Z daemon:mds.mds.np0005486731.onyaog [INFO] \\\"Deployed mds.mds.np0005486731.onyaog on host 'np0005486731.localdomain'\\\"\"], \"hostname\": \"np0005486731.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:00:11.215345Z\", \"memory_usage\": 16536043, \"ports\": [], \"service_name\": \"mds.mds\", \"started\": \"2025-10-14T10:00:08.994400Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}, {\"container_id\": \"4e2b680367a5\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"2.30%\", \"created\": \"2025-10-14T10:00:06.754661Z\", \"daemon_id\": \"mds.np0005486732.xkownj\", \"daemon_name\": 
\"mds.mds.np0005486732.xkownj\", \"daemon_type\": \"mds\", \"events\": [\"2025-10-14T10:00:06.829406Z daemon:mds.mds.np0005486732.xkownj [INFO] \\\"Deployed mds.mds.np0005486732.xkownj on host 'np0005486732.localdomain'\\\"\"], \"hostname\": \"np0005486732.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:00:11.245934Z\", \"memory_usage\": 15749611, \"ports\": [], \"service_name\": \"mds.mds\", \"started\": \"2025-10-14T10:00:06.625634Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}, {\"container_id\": \"59f0a590df7a\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"1.45%\", \"created\": \"2025-10-14T10:00:04.259650Z\", \"daemon_id\": \"mds.np0005486733.tvstmf\", \"daemon_name\": \"mds.mds.np0005486733.tvstmf\", \"daemon_type\": \"mds\", \"events\": [\"2025-10-14T10:00:04.383604Z daemon:mds.mds.np0005486733.tvstmf [INFO] \\\"Deployed mds.mds.np0005486733.tvstmf on host 'np0005486733.localdomain'\\\"\"], \"hostname\": \"np0005486733.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:00:11.091907Z\", \"memory_usage\": 16389242, \"ports\": [], \"service_name\": \"mds.mds\", \"started\": \"2025-10-14T10:00:04.138083Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}]", "stdout_lines": ["", "[{\"container_id\": \"2e71b617cf95\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"0.04%\", \"created\": \"2025-10-14T08:11:53.152225Z\", \"daemon_id\": \"mds.np0005486728.mdbtxc\", \"daemon_name\": \"mds.mds.np0005486728.mdbtxc\", \"daemon_type\": \"mds\", \"events\": [\"2025-10-14T08:11:53.239247Z daemon:mds.mds.np0005486728.mdbtxc [INFO] \\\"Deployed mds.mds.np0005486728.mdbtxc on host 'np0005486728.localdomain'\\\"\"], \"hostname\": \"np0005486728.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T09:55:46.204791Z\", \"memory_usage\": 28426895, \"ports\": [], \"service_name\": \"mds.mds\", \"started\": \"2025-10-14T08:11:53.041306Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}, {\"container_id\": \"bd243c8a61ed\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"0.04%\", \"created\": \"2025-10-14T08:11:55.462686Z\", \"daemon_id\": \"mds.np0005486729.iznaug\", \"daemon_name\": \"mds.mds.np0005486729.iznaug\", 
\"daemon_type\": \"mds\", \"events\": [\"2025-10-14T08:11:55.546678Z daemon:mds.mds.np0005486729.iznaug [INFO] \\\"Deployed mds.mds.np0005486729.iznaug on host 'np0005486729.localdomain'\\\"\"], \"hostname\": \"np0005486729.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T09:55:46.226087Z\", \"memory_usage\": 28080865, \"ports\": [], \"service_name\": \"mds.mds\", \"started\": \"2025-10-14T08:11:55.361547Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}, {\"container_id\": \"ae34ec31a4b0\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"0.16%\", \"created\": \"2025-10-14T08:11:51.036241Z\", \"daemon_id\": \"mds.np0005486730.hzolgi\", \"daemon_name\": \"mds.mds.np0005486730.hzolgi\", \"daemon_type\": \"mds\", \"events\": [\"2025-10-14T08:11:51.108594Z daemon:mds.mds.np0005486730.hzolgi [INFO] \\\"Deployed mds.mds.np0005486730.hzolgi on host 'np0005486730.localdomain'\\\"\"], \"hostname\": \"np0005486730.localdomain\", \"is_active\": true, \"last_refresh\": \"2025-10-14T09:55:45.631759Z\", \"memory_usage\": 27273461, \"ports\": [], \"service_name\": \"mds.mds\", \"started\": \"2025-10-14T08:11:50.950576Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}, {\"container_id\": \"5454859a5ca1\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"6.10%\", \"created\": \"2025-10-14T10:00:09.084948Z\", \"daemon_id\": \"mds.np0005486731.onyaog\", \"daemon_name\": \"mds.mds.np0005486731.onyaog\", \"daemon_type\": \"mds\", \"events\": [\"2025-10-14T10:00:09.155739Z daemon:mds.mds.np0005486731.onyaog [INFO] \\\"Deployed mds.mds.np0005486731.onyaog on host 'np0005486731.localdomain'\\\"\"], \"hostname\": \"np0005486731.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:00:11.215345Z\", \"memory_usage\": 16536043, \"ports\": [], \"service_name\": \"mds.mds\", \"started\": \"2025-10-14T10:00:08.994400Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}, {\"container_id\": \"4e2b680367a5\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"2.30%\", \"created\": \"2025-10-14T10:00:06.754661Z\", \"daemon_id\": \"mds.np0005486732.xkownj\", \"daemon_name\": \"mds.mds.np0005486732.xkownj\", \"daemon_type\": \"mds\", \"events\": [\"2025-10-14T10:00:06.829406Z 
daemon:mds.mds.np0005486732.xkownj [INFO] \\\"Deployed mds.mds.np0005486732.xkownj on host 'np0005486732.localdomain'\\\"\"], \"hostname\": \"np0005486732.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:00:11.245934Z\", \"memory_usage\": 15749611, \"ports\": [], \"service_name\": \"mds.mds\", \"started\": \"2025-10-14T10:00:06.625634Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}, {\"container_id\": \"59f0a590df7a\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"1.45%\", \"created\": \"2025-10-14T10:00:04.259650Z\", \"daemon_id\": \"mds.np0005486733.tvstmf\", \"daemon_name\": \"mds.mds.np0005486733.tvstmf\", \"daemon_type\": \"mds\", \"events\": [\"2025-10-14T10:00:04.383604Z daemon:mds.mds.np0005486733.tvstmf [INFO] \\\"Deployed mds.mds.np0005486733.tvstmf on host 'np0005486733.localdomain'\\\"\"], \"hostname\": \"np0005486733.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:00:11.091907Z\", \"memory_usage\": 16389242, \"ports\": [], \"service_name\": \"mds.mds\", \"started\": \"2025-10-14T10:00:04.138083Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}]"]} TASK [ceph_migrate : Sleep before moving to the next phase] ******************** Pausing for 30 seconds ok: [np0005486728.localdomain] => {"changed": false, "delta": 30, "echo": true, "rc": 0, "start": "2025-10-14 10:00:40.382932", "stderr": "", "stdout": "Paused for 30.06 seconds", "stop": "2025-10-14 10:01:10.442170", "user_input": ""} TASK [ceph_migrate : Get ceph_cli] ********************************************* skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.rgw | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Fail if RGW VIPs are not defined] ************************* skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.rgw | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Set/Unset labels - add] *********************************** skipping: [np0005486728.localdomain] => {"false_condition": "ceph_daemons_layout.rgw | default(true) | bool"} TASK [ceph_migrate : Print nodes] ********************************************** skipping: [np0005486728.localdomain] => {"false_condition": "ceph_daemons_layout.rgw | default(true) | bool"} TASK [ceph_migrate : Set/Unset labels - add] *********************************** skipping: [np0005486728.localdomain] => (item=['np0005486731.localdomain', 'rgw']) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_daemons_layout.rgw | default(true) | bool", "item": ["np0005486731.localdomain", "rgw"], "skip_reason": "Conditional result was False"} skipping: [np0005486728.localdomain] => (item=['np0005486732.localdomain', 'rgw']) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_daemons_layout.rgw | default(true) | bool", "item": ["np0005486732.localdomain", "rgw"], "skip_reason": "Conditional result was False"} skipping: [np0005486728.localdomain] => 
(item=['np0005486733.localdomain', 'rgw']) => {"ansible_loop_var": "item", "changed": false, "false_condition": "ceph_daemons_layout.rgw | default(true) | bool", "item": ["np0005486733.localdomain", "rgw"], "skip_reason": "Conditional result was False"} skipping: [np0005486728.localdomain] => {"changed": false, "msg": "All items skipped"} TASK [ceph_migrate : RGW - Load Spec from the orchestrator] ******************** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.rgw | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Print the loaded data] ************************************ skipping: [np0005486728.localdomain] => {"false_condition": "ceph_daemons_layout.rgw | default(true) | bool"} TASK [ceph_migrate : Apply ceph rgw keystone config] *************************** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.rgw | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Update the RGW spec definition] *************************** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.rgw | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Print the resulting spec] ********************************* skipping: [np0005486728.localdomain] => {"false_condition": "ceph_daemons_layout.rgw | default(true) | bool"} TASK [ceph_migrate : Create the Ingress Daemon spec definition for RGW] ******** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.rgw | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Print the resulting spec] ********************************* skipping: [np0005486728.localdomain] => {"false_condition": "ceph_daemons_layout.rgw | default(true) | bool"} TASK [ceph_migrate : Wait for cephadm to redeploy] ***************************** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.rgw | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : RGW - wait daemons] *************************************** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.rgw | default(true) | bool", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Setup a Ceph client to the first node] ******************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_client.yaml for np0005486728.localdomain TASK [ceph_migrate : TMP_CLIENT - Patch os-net-config config and setup a tmp client IP] *** changed: [np0005486728.localdomain] => {"backup": "/etc/os-net-config/tripleo_config.yaml.486868.2025-10-14@10:01:11~", "changed": true, "msg": "line added and ownership, perms or SE linux context changed"} TASK [ceph_migrate : TMP_CLIENT - Refresh os-net-config] *********************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["os-net-config", "-c", "/etc/os-net-config/tripleo_config.yaml"], "delta": "0:00:07.212776", "end": "2025-10-14 10:01:19.453616", "msg": "", "rc": 0, "start": "2025-10-14 10:01:12.240840", "stderr": "", "stderr_lines": [], "stdout": "2025-10-14 10:01:13.086 ERROR os_net_config.execute stderr : WARN : [ifdown] You are using 'ifdown' script provided by 'network-scripts', which are now deprecated.\nWARN : [ifdown] 
'network-scripts' will be removed from distribution in near future.\nWARN : [ifdown] It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.\n\n2025-10-14 10:01:19.391 ERROR os_net_config.execute stderr : WARN : [ifup] You are using 'ifup' script provided by 'network-scripts', which are now deprecated.\nWARN : [ifup] 'network-scripts' will be removed from distribution in near future.\nWARN : [ifup] It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.", "stdout_lines": ["2025-10-14 10:01:13.086 ERROR os_net_config.execute stderr : WARN : [ifdown] You are using 'ifdown' script provided by 'network-scripts', which are now deprecated.", "WARN : [ifdown] 'network-scripts' will be removed from distribution in near future.", "WARN : [ifdown] It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.", "", "2025-10-14 10:01:19.391 ERROR os_net_config.execute stderr : WARN : [ifup] You are using 'ifup' script provided by 'network-scripts', which are now deprecated.", "WARN : [ifup] 'network-scripts' will be removed from distribution in near future.", "WARN : [ifup] It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well."]} TASK [ceph_migrate : Backup data for client purposes] ************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/backup.yaml for np0005486728.localdomain TASK [ceph_migrate : Ensure backup directory exists] *************************** changed: [np0005486728.localdomain] => {"changed": true, "gid": 1003, "group": "tripleo-admin", "mode": "0755", "owner": "tripleo-admin", "path": "/home/tripleo-admin/ceph_client", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 6, "state": "directory", "uid": 1003} TASK [ceph_migrate : Check file in the src directory] ************************** ok: [np0005486728.localdomain] => {"changed": false, "examined": 4, "files": [{"atime": 1760429483.305314, "ctime": 1760429482.219281, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 956301985, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1760428372.0972443, "nlink": 1, "path": "/etc/ceph/ceph.conf", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 271, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, {"atime": 1760429483.3153143, "ctime": 1760429482.219281, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 956301984, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0600", "mtime": 1760428235.1245375, "nlink": 1, "path": "/etc/ceph/ceph.client.admin.keyring", "pw_name": "root", "rgrp": false, "roth": false, "rusr": true, "size": 151, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, {"atime": 1760429520.348444, "ctime": 1760429518.3283825, "dev": 64516, "gid": 167, "gr_name": "", "inode": 796955805, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1760429518.0583742, "nlink": 1, "path": "/etc/ceph/ceph.client.openstack.keyring", "pw_name": "", "rgrp": true, "roth": true, "rusr": true, 
"size": 231, "uid": 167, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, {"atime": 1760429521.3984761, "ctime": 1760429519.293412, "dev": 64516, "gid": 167, "gr_name": "", "inode": 822120682, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1760429518.9504013, "nlink": 1, "path": "/etc/ceph/ceph.client.manila.keyring", "pw_name": "", "rgrp": true, "roth": true, "rusr": true, "size": 153, "uid": 167, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}], "matched": 4, "msg": "All paths examined", "skipped_paths": {}} TASK [ceph_migrate : Backup ceph client data] ********************************** changed: [np0005486728.localdomain] => (item={'path': '/etc/ceph/ceph.conf', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 271, 'inode': 956301985, 'dev': 64516, 'nlink': 1, 'atime': 1760429483.305314, 'mtime': 1760428372.0972443, 'ctime': 1760429482.219281, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) => {"ansible_loop_var": "item", "changed": true, "checksum": "3ea08ebaa38e66fdc9487ab3279546d8d5630636", "dest": "/home/tripleo-admin/ceph_client/ceph.conf", "gid": 0, "group": "root", "item": {"atime": 1760429483.305314, "ctime": 1760429482.219281, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 956301985, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1760428372.0972443, "nlink": 1, "path": "/etc/ceph/ceph.conf", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 271, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, "md5sum": "06d2e6ed974b4cac3fa52dc8c375b161", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 271, "src": "/etc/ceph/ceph.conf", "state": "file", "uid": 0} changed: [np0005486728.localdomain] => (item={'path': '/etc/ceph/ceph.client.admin.keyring', 'mode': '0600', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 151, 'inode': 956301984, 'dev': 64516, 'nlink': 1, 'atime': 1760429483.3153143, 'mtime': 1760428235.1245375, 'ctime': 1760429482.219281, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': False, 'xgrp': False, 'woth': False, 'roth': False, 'xoth': False, 'isuid': False, 'isgid': False}) => {"ansible_loop_var": "item", "changed": true, "checksum": "431588685abe2dfd7f03c8784108d8962e66b6df", "dest": "/home/tripleo-admin/ceph_client/ceph.client.admin.keyring", "gid": 0, "group": "root", "item": {"atime": 1760429483.3153143, "ctime": 1760429482.219281, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 956301984, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0600", "mtime": 1760428235.1245375, "nlink": 1, "path": "/etc/ceph/ceph.client.admin.keyring", "pw_name": "root", "rgrp": false, "roth": false, "rusr": true, "size": 151, "uid": 0, 
"wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, "md5sum": "24ba031d88354429461e0fa6a2c9f8ca", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 151, "src": "/etc/ceph/ceph.client.admin.keyring", "state": "file", "uid": 0} changed: [np0005486728.localdomain] => (item={'path': '/etc/ceph/ceph.client.openstack.keyring', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 167, 'gid': 167, 'size': 231, 'inode': 796955805, 'dev': 64516, 'nlink': 1, 'atime': 1760429520.348444, 'mtime': 1760429518.0583742, 'ctime': 1760429518.3283825, 'gr_name': '', 'pw_name': '', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) => {"ansible_loop_var": "item", "changed": true, "checksum": "0991400062f1e3522feec6859340320816889889", "dest": "/home/tripleo-admin/ceph_client/ceph.client.openstack.keyring", "gid": 0, "group": "root", "item": {"atime": 1760429520.348444, "ctime": 1760429518.3283825, "dev": 64516, "gid": 167, "gr_name": "", "inode": 796955805, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1760429518.0583742, "nlink": 1, "path": "/etc/ceph/ceph.client.openstack.keyring", "pw_name": "", "rgrp": true, "roth": true, "rusr": true, "size": 231, "uid": 167, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, "md5sum": "f1969ad56204565f1b065784bcb34d60", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 231, "src": "/etc/ceph/ceph.client.openstack.keyring", "state": "file", "uid": 0} changed: [np0005486728.localdomain] => (item={'path': '/etc/ceph/ceph.client.manila.keyring', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 167, 'gid': 167, 'size': 153, 'inode': 822120682, 'dev': 64516, 'nlink': 1, 'atime': 1760429521.3984761, 'mtime': 1760429518.9504013, 'ctime': 1760429519.293412, 'gr_name': '', 'pw_name': '', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) => {"ansible_loop_var": "item", "changed": true, "checksum": "ba6c47c4b62a1635e77f10e9e003b0ff16f31619", "dest": "/home/tripleo-admin/ceph_client/ceph.client.manila.keyring", "gid": 0, "group": "root", "item": {"atime": 1760429521.3984761, "ctime": 1760429519.293412, "dev": 64516, "gid": 167, "gr_name": "", "inode": 822120682, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1760429518.9504013, "nlink": 1, "path": "/etc/ceph/ceph.client.manila.keyring", "pw_name": "", "rgrp": true, "roth": true, "rusr": true, "size": 153, "uid": 167, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, "md5sum": "dd7469b0657d0c11e2be3e737e995e97", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 153, "src": "/etc/ceph/ceph.client.manila.keyring", "state": "file", "uid": 0} TASK [ceph_migrate : Render global ceph.conf] ********************************** changed: [np0005486728.localdomain] => 
{"changed": true, "checksum": "4b9917b88c50c9226c36092026baa78b27db1ccb", "dest": "/home/tripleo-admin/ceph_client/ceph.conf", "gid": 0, "group": "root", "md5sum": "9d8df54682588775cc0c4ccd7bffe88e", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 142, "src": "/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760436083.0869842-57697-6495616255503/source", "state": "file", "uid": 0} TASK [ceph_migrate : MGR - Migrate RBD node] *********************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/mgr.yaml for np0005486728.localdomain TASK [ceph_migrate : Get ceph_cli] ********************************************* included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_cli.yaml for np0005486728.localdomain TASK [ceph_migrate : Set ceph CLI] ********************************************* ok: [np0005486728.localdomain] => {"ansible_facts": {"ceph_cli": "podman run --rm --net=host --ipc=host --volume /etc/ceph:/etc/ceph:z --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring "}, "changed": false} TASK [ceph_migrate : MGR - Setup Mon/Mgr label to the target node] ************* included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/labels.yaml for np0005486728.localdomain TASK [ceph_migrate : Set/Unset labels - add] *********************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Print nodes] ********************************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Set/Unset labels - add] *********************************** changed: [np0005486728.localdomain] => (item=['np0005486731.localdomain', 'mgr']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005486731.localdomain", "mgr"], "delta": "0:00:00.700296", "end": "2025-10-14 10:01:25.679635", "item": ["np0005486731.localdomain", "mgr"], "msg": "", "rc": 0, "start": "2025-10-14 10:01:24.979339", "stderr": "", "stderr_lines": [], "stdout": "Added label mgr to host np0005486731.localdomain", "stdout_lines": ["Added label mgr to host np0005486731.localdomain"]} changed: [np0005486728.localdomain] => (item=['np0005486732.localdomain', 'mgr']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005486732.localdomain", "mgr"], "delta": "0:00:00.733971", "end": "2025-10-14 10:01:26.933365", "item": ["np0005486732.localdomain", "mgr"], "msg": "", "rc": 0, "start": "2025-10-14 10:01:26.199394", "stderr": "", "stderr_lines": [], "stdout": "Added label mgr to host np0005486732.localdomain", 
"stdout_lines": ["Added label mgr to host np0005486732.localdomain"]} changed: [np0005486728.localdomain] => (item=['np0005486733.localdomain', 'mgr']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005486733.localdomain", "mgr"], "delta": "0:00:00.779229", "end": "2025-10-14 10:01:28.290515", "item": ["np0005486733.localdomain", "mgr"], "msg": "", "rc": 0, "start": "2025-10-14 10:01:27.511286", "stderr": "", "stderr_lines": [], "stdout": "Added label mgr to host np0005486733.localdomain", "stdout_lines": ["Added label mgr to host np0005486733.localdomain"]} TASK [ceph_migrate : MGR - Load Spec from the orchestrator] ******************** ok: [np0005486728.localdomain] => {"ansible_facts": {"mgr_spec": {"service_name": "mgr", "service_type": "mgr", "spec": {}}}, "changed": false} TASK [ceph_migrate : Update the MGR Daemon spec definition] ******************** ok: [np0005486728.localdomain] => {"changed": false, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "-v", "/etc/ceph:/etc/ceph:z", "-v", "/var/lib/ceph/:/var/lib/ceph/:z", "-v", "/var/log/ceph/:/var/log/ceph/:z", "-v", "/home/tripleo-admin/mgr:/home/tripleo-admin/mgr:z", "--entrypoint=ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "-n", "client.admin", "-k", "/etc/ceph/ceph.client.admin.keyring", "--cluster", "ceph", "orch", "apply", "--in-file", "/home/tripleo-admin/mgr"], "delta": "0:00:00.702226", "end": "2025-10-14 10:01:29.675398", "rc": 0, "start": "2025-10-14 10:01:28.973172", "stderr": "", "stderr_lines": [], "stdout": "Scheduled mgr update...", "stdout_lines": ["Scheduled mgr update..."]} TASK [ceph_migrate : Print the resulting spec] ********************************* skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : MGR - wait daemons] *************************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/wait_daemons.yaml for np0005486728.localdomain TASK [ceph_migrate : print daemon id option] *********************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : wait for mgr] ********************************************* changed: [np0005486728.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "ps", "--daemon_type", "mgr", "-f", "json"], "delta": "0:00:00.719416", "end": "2025-10-14 10:01:31.128757", "msg": "", "rc": 0, "start": "2025-10-14 10:01:30.409341", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"container_id\": \"ccb1d35fc6fa\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\"], \"container_image_id\": 
\"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"3.28%\", \"created\": \"2025-10-14T07:49:58.109381Z\", \"daemon_id\": \"np0005486728.giajub\", \"daemon_name\": \"mgr.np0005486728.giajub\", \"daemon_type\": \"mgr\", \"events\": [\"2025-10-14T07:52:55.078935Z daemon:mgr.np0005486728.giajub [INFO] \\\"Reconfigured mgr.np0005486728.giajub on host 'np0005486728.localdomain'\\\"\"], \"hostname\": \"np0005486728.localdomain\", \"is_active\": true, \"last_refresh\": \"2025-10-14T10:00:44.533028Z\", \"memory_usage\": 544840089, \"ports\": [9283, 8765], \"service_name\": \"mgr\", \"started\": \"2025-10-14T07:49:57.976509Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}, {\"container_id\": \"7ac6c35589c9\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"0.19%\", \"created\": \"2025-10-14T07:52:12.518560Z\", \"daemon_id\": \"np0005486729.xpybho\", \"daemon_name\": \"mgr.np0005486729.xpybho\", \"daemon_type\": \"mgr\", \"events\": [\"2025-10-14T07:52:12.608003Z daemon:mgr.np0005486729.xpybho [INFO] \\\"Deployed mgr.np0005486729.xpybho on host 'np0005486729.localdomain'\\\"\"], \"hostname\": \"np0005486729.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:00:59.201725Z\", \"memory_usage\": 479408947, \"ports\": [8765], \"service_name\": \"mgr\", \"started\": \"2025-10-14T07:52:12.385224Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}, {\"container_id\": \"ec137a046281\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"0.18%\", \"created\": \"2025-10-14T07:52:06.147470Z\", \"daemon_id\": \"np0005486730.ddfidc\", \"daemon_name\": \"mgr.np0005486730.ddfidc\", \"daemon_type\": \"mgr\", \"events\": [\"2025-10-14T07:52:10.431262Z daemon:mgr.np0005486730.ddfidc [INFO] \\\"Deployed mgr.np0005486730.ddfidc on host 'np0005486730.localdomain'\\\"\", \"2025-10-14T07:52:59.259113Z daemon:mgr.np0005486730.ddfidc [INFO] \\\"Reconfigured mgr.np0005486730.ddfidc on host 'np0005486730.localdomain'\\\"\"], \"hostname\": \"np0005486730.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:00:59.104684Z\", \"memory_usage\": 479408947, \"ports\": [8765], \"service_name\": \"mgr\", \"started\": \"2025-10-14T07:52:06.030003Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}]", "stdout_lines": ["", "[{\"container_id\": \"ccb1d35fc6fa\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\", 
\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"3.28%\", \"created\": \"2025-10-14T07:49:58.109381Z\", \"daemon_id\": \"np0005486728.giajub\", \"daemon_name\": \"mgr.np0005486728.giajub\", \"daemon_type\": \"mgr\", \"events\": [\"2025-10-14T07:52:55.078935Z daemon:mgr.np0005486728.giajub [INFO] \\\"Reconfigured mgr.np0005486728.giajub on host 'np0005486728.localdomain'\\\"\"], \"hostname\": \"np0005486728.localdomain\", \"is_active\": true, \"last_refresh\": \"2025-10-14T10:00:44.533028Z\", \"memory_usage\": 544840089, \"ports\": [9283, 8765], \"service_name\": \"mgr\", \"started\": \"2025-10-14T07:49:57.976509Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}, {\"container_id\": \"7ac6c35589c9\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"0.19%\", \"created\": \"2025-10-14T07:52:12.518560Z\", \"daemon_id\": \"np0005486729.xpybho\", \"daemon_name\": \"mgr.np0005486729.xpybho\", \"daemon_type\": \"mgr\", \"events\": [\"2025-10-14T07:52:12.608003Z daemon:mgr.np0005486729.xpybho [INFO] \\\"Deployed mgr.np0005486729.xpybho on host 'np0005486729.localdomain'\\\"\"], \"hostname\": \"np0005486729.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:00:59.201725Z\", \"memory_usage\": 479408947, \"ports\": [8765], \"service_name\": \"mgr\", \"started\": \"2025-10-14T07:52:12.385224Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}, {\"container_id\": \"ec137a046281\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"0.18%\", \"created\": \"2025-10-14T07:52:06.147470Z\", \"daemon_id\": \"np0005486730.ddfidc\", \"daemon_name\": \"mgr.np0005486730.ddfidc\", \"daemon_type\": \"mgr\", \"events\": [\"2025-10-14T07:52:10.431262Z daemon:mgr.np0005486730.ddfidc [INFO] \\\"Deployed mgr.np0005486730.ddfidc on host 'np0005486730.localdomain'\\\"\", \"2025-10-14T07:52:59.259113Z daemon:mgr.np0005486730.ddfidc [INFO] \\\"Reconfigured mgr.np0005486730.ddfidc on host 'np0005486730.localdomain'\\\"\"], \"hostname\": \"np0005486730.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:00:59.104684Z\", \"memory_usage\": 479408947, \"ports\": [8765], \"service_name\": \"mgr\", \"started\": \"2025-10-14T07:52:06.030003Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}]"]} TASK [ceph_migrate : MON - Load Spec from the orchestrator] ******************** ok: [np0005486728.localdomain] => {"ansible_facts": 
{"mon_spec": {"service_name": "mon", "service_type": "mon", "spec": {}}}, "changed": false} TASK [ceph_migrate : Set/Unset labels - add] *********************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Print nodes] ********************************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Set/Unset labels - add] *********************************** changed: [np0005486728.localdomain] => (item=['np0005486728.localdomain', 'mon']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005486728.localdomain", "mon"], "delta": "0:00:00.717227", "end": "2025-10-14 10:01:32.628625", "item": ["np0005486728.localdomain", "mon"], "msg": "", "rc": 0, "start": "2025-10-14 10:01:31.911398", "stderr": "", "stderr_lines": [], "stdout": "Added label mon to host np0005486728.localdomain", "stdout_lines": ["Added label mon to host np0005486728.localdomain"]} changed: [np0005486728.localdomain] => (item=['np0005486728.localdomain', '_admin']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005486728.localdomain", "_admin"], "delta": "0:00:00.722864", "end": "2025-10-14 10:01:33.871985", "item": ["np0005486728.localdomain", "_admin"], "msg": "", "rc": 0, "start": "2025-10-14 10:01:33.149121", "stderr": "", "stderr_lines": [], "stdout": "Added label _admin to host np0005486728.localdomain", "stdout_lines": ["Added label _admin to host np0005486728.localdomain"]} changed: [np0005486728.localdomain] => (item=['np0005486729.localdomain', 'mon']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005486729.localdomain", "mon"], "delta": "0:00:00.714125", "end": "2025-10-14 10:01:35.128474", "item": ["np0005486729.localdomain", "mon"], "msg": "", "rc": 0, "start": "2025-10-14 10:01:34.414349", "stderr": "", "stderr_lines": [], "stdout": "Added label mon to host np0005486729.localdomain", "stdout_lines": ["Added label mon to host np0005486729.localdomain"]} changed: [np0005486728.localdomain] => (item=['np0005486729.localdomain', '_admin']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005486729.localdomain", "_admin"], "delta": "0:00:00.681041", "end": 
"2025-10-14 10:01:36.420384", "item": ["np0005486729.localdomain", "_admin"], "msg": "", "rc": 0, "start": "2025-10-14 10:01:35.739343", "stderr": "", "stderr_lines": [], "stdout": "Added label _admin to host np0005486729.localdomain", "stdout_lines": ["Added label _admin to host np0005486729.localdomain"]} changed: [np0005486728.localdomain] => (item=['np0005486730.localdomain', 'mon']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005486730.localdomain", "mon"], "delta": "0:00:00.819714", "end": "2025-10-14 10:01:37.758663", "item": ["np0005486730.localdomain", "mon"], "msg": "", "rc": 0, "start": "2025-10-14 10:01:36.938949", "stderr": "", "stderr_lines": [], "stdout": "Added label mon to host np0005486730.localdomain", "stdout_lines": ["Added label mon to host np0005486730.localdomain"]} changed: [np0005486728.localdomain] => (item=['np0005486730.localdomain', '_admin']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005486730.localdomain", "_admin"], "delta": "0:00:00.700291", "end": "2025-10-14 10:01:39.020163", "item": ["np0005486730.localdomain", "_admin"], "msg": "", "rc": 0, "start": "2025-10-14 10:01:38.319872", "stderr": "", "stderr_lines": [], "stdout": "Added label _admin to host np0005486730.localdomain", "stdout_lines": ["Added label _admin to host np0005486730.localdomain"]} changed: [np0005486728.localdomain] => (item=['np0005486731.localdomain', 'mon']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005486731.localdomain", "mon"], "delta": "0:00:00.723804", "end": "2025-10-14 10:01:40.321245", "item": ["np0005486731.localdomain", "mon"], "msg": "", "rc": 0, "start": "2025-10-14 10:01:39.597441", "stderr": "", "stderr_lines": [], "stdout": "Added label mon to host np0005486731.localdomain", "stdout_lines": ["Added label mon to host np0005486731.localdomain"]} changed: [np0005486728.localdomain] => (item=['np0005486731.localdomain', '_admin']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005486731.localdomain", "_admin"], "delta": "0:00:00.681404", "end": "2025-10-14 10:01:41.551583", "item": ["np0005486731.localdomain", "_admin"], "msg": "", "rc": 0, "start": "2025-10-14 10:01:40.870179", "stderr": "", "stderr_lines": [], "stdout": "Added label _admin to 
host np0005486731.localdomain", "stdout_lines": ["Added label _admin to host np0005486731.localdomain"]} changed: [np0005486728.localdomain] => (item=['np0005486732.localdomain', 'mon']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005486732.localdomain", "mon"], "delta": "0:00:00.656560", "end": "2025-10-14 10:01:42.783565", "item": ["np0005486732.localdomain", "mon"], "msg": "", "rc": 0, "start": "2025-10-14 10:01:42.127005", "stderr": "", "stderr_lines": [], "stdout": "Added label mon to host np0005486732.localdomain", "stdout_lines": ["Added label mon to host np0005486732.localdomain"]} changed: [np0005486728.localdomain] => (item=['np0005486732.localdomain', '_admin']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005486732.localdomain", "_admin"], "delta": "0:00:00.718696", "end": "2025-10-14 10:01:44.089673", "item": ["np0005486732.localdomain", "_admin"], "msg": "", "rc": 0, "start": "2025-10-14 10:01:43.370977", "stderr": "", "stderr_lines": [], "stdout": "Added label _admin to host np0005486732.localdomain", "stdout_lines": ["Added label _admin to host np0005486732.localdomain"]} changed: [np0005486728.localdomain] => (item=['np0005486733.localdomain', 'mon']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005486733.localdomain", "mon"], "delta": "0:00:00.689880", "end": "2025-10-14 10:01:45.405870", "item": ["np0005486733.localdomain", "mon"], "msg": "", "rc": 0, "start": "2025-10-14 10:01:44.715990", "stderr": "", "stderr_lines": [], "stdout": "Added label mon to host np0005486733.localdomain", "stdout_lines": ["Added label mon to host np0005486733.localdomain"]} changed: [np0005486728.localdomain] => (item=['np0005486733.localdomain', '_admin']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "add", "np0005486733.localdomain", "_admin"], "delta": "0:00:00.722533", "end": "2025-10-14 10:01:46.650039", "item": ["np0005486733.localdomain", "_admin"], "msg": "", "rc": 0, "start": "2025-10-14 10:01:45.927506", "stderr": "", "stderr_lines": [], "stdout": "Added label _admin to host np0005486733.localdomain", "stdout_lines": ["Added label _admin to host np0005486733.localdomain"]} TASK [ceph_migrate : Normalize the mon spec to use labels] ********************* ok: 
[np0005486728.localdomain] => {"changed": false, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "-v", "/etc/ceph:/etc/ceph:z", "-v", "/var/lib/ceph/:/var/lib/ceph/:z", "-v", "/var/log/ceph/:/var/log/ceph/:z", "-v", "/home/tripleo-admin/mon:/home/tripleo-admin/mon:z", "--entrypoint=ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "-n", "client.admin", "-k", "/etc/ceph/ceph.client.admin.keyring", "--cluster", "ceph", "orch", "apply", "--in-file", "/home/tripleo-admin/mon"], "delta": "0:00:00.718591", "end": "2025-10-14 10:01:47.982711", "rc": 0, "start": "2025-10-14 10:01:47.264120", "stderr": "", "stderr_lines": [], "stdout": "Scheduled mon update...", "stdout_lines": ["Scheduled mon update..."]} TASK [ceph_migrate : RBD - wait new daemons to be available] ******************* included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/wait_daemons.yaml for np0005486728.localdomain => (item=np0005486731.localdomain) included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/wait_daemons.yaml for np0005486728.localdomain => (item=np0005486732.localdomain) included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/wait_daemons.yaml for np0005486728.localdomain => (item=np0005486733.localdomain) TASK [ceph_migrate : print daemon id option] *********************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : wait for mon] ********************************************* FAILED - RETRYING: [np0005486728.localdomain]: wait for mon (200 retries left). FAILED - RETRYING: [np0005486728.localdomain]: wait for mon (199 retries left). 
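(The two FAILED - RETRYING messages above are the role's wait loop: it keeps polling the orchestrator until the freshly scheduled mon daemon on np0005486731 reports as running, then proceeds. A minimal manual sketch of that same check, reusing the exact command the task runs and assuming the admin keyring/ceph.conf paths it mounts plus jq being available on the host — jq is not shown in this log — would be:

    # Illustrative only: poll "ceph orch ps" until the new mon reports status_desc "running".
    # The role above allows up to 200 attempts; sleep interval here is arbitrary.
    until podman run --rm --net=host --ipc=host \
            --volume /etc/ceph:/etc/ceph:z \
            --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest \
            --fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf \
            -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring \
            orch ps --daemon_type mon --daemon_id np0005486731 -f json \
            | jq -e '.[0].status_desc == "running"' >/dev/null; do
        sleep 5
    done

While the daemon is still being deployed the orchestrator returns an empty list, the jq test evaluates to false, and the loop simply retries — which is what the two failed attempts below correspond to before the third attempt succeeds.)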
changed: [np0005486728.localdomain] => {"attempts": 3, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "ps", "--daemon_type", "mon", "--daemon_id", "np0005486731", "-f", "json"], "delta": "0:00:06.767058", "end": "2025-10-14 10:02:11.607018", "msg": "", "rc": 0, "start": "2025-10-14 10:02:04.839960", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"container_id\": \"8bb7ee7976ae\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"3.41%\", \"created\": \"2025-10-14T10:02:01.285904Z\", \"daemon_id\": \"np0005486731\", \"daemon_name\": \"mon.np0005486731\", \"daemon_type\": \"mon\", \"events\": [\"2025-10-14T10:02:05.592767Z daemon:mon.np0005486731 [INFO] \\\"Deployed mon.np0005486731 on host 'np0005486731.localdomain'\\\"\"], \"hostname\": \"np0005486731.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:02:08.019672Z\", \"memory_request\": 2147483648, \"memory_usage\": 44795166, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2025-10-14T10:02:01.189916Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}]", "stdout_lines": ["", "[{\"container_id\": \"8bb7ee7976ae\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"3.41%\", \"created\": \"2025-10-14T10:02:01.285904Z\", \"daemon_id\": \"np0005486731\", \"daemon_name\": \"mon.np0005486731\", \"daemon_type\": \"mon\", \"events\": [\"2025-10-14T10:02:05.592767Z daemon:mon.np0005486731 [INFO] \\\"Deployed mon.np0005486731 on host 'np0005486731.localdomain'\\\"\"], \"hostname\": \"np0005486731.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:02:08.019672Z\", \"memory_request\": 2147483648, \"memory_usage\": 44795166, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2025-10-14T10:02:01.189916Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}]"]} TASK [ceph_migrate : print daemon id option] *********************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : wait for mon] ********************************************* changed: [np0005486728.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", 
"/etc/ceph/ceph.client.admin.keyring", "orch", "ps", "--daemon_type", "mon", "--daemon_id", "np0005486732", "-f", "json"], "delta": "0:00:00.715008", "end": "2025-10-14 10:02:12.923683", "msg": "", "rc": 0, "start": "2025-10-14 10:02:12.208675", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"container_id\": \"95ed365d9db2\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"3.38%\", \"created\": \"2025-10-14T10:01:56.076483Z\", \"daemon_id\": \"np0005486732\", \"daemon_name\": \"mon.np0005486732\", \"daemon_type\": \"mon\", \"events\": [\"2025-10-14T10:01:58.663530Z daemon:mon.np0005486732 [INFO] \\\"Deployed mon.np0005486732 on host 'np0005486732.localdomain'\\\"\"], \"hostname\": \"np0005486732.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:02:08.013654Z\", \"memory_request\": 2147483648, \"memory_usage\": 51222937, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2025-10-14T10:01:55.976434Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}]", "stdout_lines": ["", "[{\"container_id\": \"95ed365d9db2\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"3.38%\", \"created\": \"2025-10-14T10:01:56.076483Z\", \"daemon_id\": \"np0005486732\", \"daemon_name\": \"mon.np0005486732\", \"daemon_type\": \"mon\", \"events\": [\"2025-10-14T10:01:58.663530Z daemon:mon.np0005486732 [INFO] \\\"Deployed mon.np0005486732 on host 'np0005486732.localdomain'\\\"\"], \"hostname\": \"np0005486732.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:02:08.013654Z\", \"memory_request\": 2147483648, \"memory_usage\": 51222937, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2025-10-14T10:01:55.976434Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}]"]} TASK [ceph_migrate : print daemon id option] *********************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : wait for mon] ********************************************* changed: [np0005486728.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/etc/ceph:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "ps", "--daemon_type", "mon", "--daemon_id", "np0005486733", "-f", "json"], "delta": "0:00:00.678567", "end": "2025-10-14 10:02:14.310043", "msg": "", "rc": 0, "start": "2025-10-14 10:02:13.631476", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"container_id\": \"294d8462825a\", \"container_image_digests\": 
[\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"3.03%\", \"created\": \"2025-10-14T10:01:53.292940Z\", \"daemon_id\": \"np0005486733\", \"daemon_name\": \"mon.np0005486733\", \"daemon_type\": \"mon\", \"events\": [\"2025-10-14T10:01:53.372432Z daemon:mon.np0005486733 [INFO] \\\"Deployed mon.np0005486733 on host 'np0005486733.localdomain'\\\"\"], \"hostname\": \"np0005486733.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:02:07.763127Z\", \"memory_request\": 2147483648, \"memory_usage\": 48234496, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2025-10-14T10:01:53.168491Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}]", "stdout_lines": ["", "[{\"container_id\": \"294d8462825a\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"3.03%\", \"created\": \"2025-10-14T10:01:53.292940Z\", \"daemon_id\": \"np0005486733\", \"daemon_name\": \"mon.np0005486733\", \"daemon_type\": \"mon\", \"events\": [\"2025-10-14T10:01:53.372432Z daemon:mon.np0005486733 [INFO] \\\"Deployed mon.np0005486733 on host 'np0005486733.localdomain'\\\"\"], \"hostname\": \"np0005486733.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:02:07.763127Z\", \"memory_request\": 2147483648, \"memory_usage\": 48234496, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2025-10-14T10:01:53.168491Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}]"]} TASK [ceph_migrate : MON - Migrate RBD node] *********************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/mon.yaml for np0005486728.localdomain => (item=['np0005486728.localdomain', 'np0005486731.localdomain']) included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/mon.yaml for np0005486728.localdomain => (item=['np0005486729.localdomain', 'np0005486732.localdomain']) included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/mon.yaml for np0005486728.localdomain => (item=['np0005486730.localdomain', 'np0005486733.localdomain']) TASK [ceph_migrate : Get ceph_cli] ********************************************* included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_cli.yaml for np0005486728.localdomain TASK [ceph_migrate : Set ceph CLI] ********************************************* ok: [np0005486728.localdomain] => {"ansible_facts": {"ceph_cli": "podman run --rm --net=host --ipc=host --volume /home/tripleo-admin/ceph_client:/etc/ceph:z --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid 
fcadf6e2-9176-5818-a8d0-37b19acf8eaf -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring "}, "changed": false} TASK [ceph_migrate : Migrate mon] ********************************************** ok: [np0005486728.localdomain] => { "msg": "Migrate mon: np0005486728.localdomain to node: np0005486731.localdomain" } TASK [ceph_migrate : MON - Get current mon IP address from node_map override] *** ok: [np0005486728.localdomain] => {"ansible_facts": {"mon_ipaddr": "172.18.0.103"}, "changed": false} TASK [ceph_migrate : MON - Check mons quorum] ********************************** changed: [np0005486728.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "-s", "-f", "json"], "delta": "0:00:00.740912", "end": "2025-10-14 10:02:15.994322", "msg": "", "rc": 0, "start": "2025-10-14 10:02:15.253410", "stderr": "", "stderr_lines": [], "stdout": "\n{\"fsid\":\"fcadf6e2-9176-5818-a8d0-37b19acf8eaf\",\"health\":{\"status\":\"HEALTH_OK\",\"checks\":{},\"mutes\":[]},\"election_epoch\":28,\"quorum\":[0,1,2,3,4,5],\"quorum_names\":[\"np0005486728\",\"np0005486730\",\"np0005486729\",\"np0005486733\",\"np0005486732\",\"np0005486731\"],\"quorum_age\":4,\"monmap\":{\"epoch\":6,\"min_mon_release_name\":\"reef\",\"num_mons\":6},\"osdmap\":{\"epoch\":79,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1760428371,\"num_in_osds\":6,\"osd_in_since\":1760428350,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109571242,\"bytes_used\":596840448,\"bytes_avail\":44475150336,\"bytes_total\":45071990784},\"fsmap\":{\"epoch\":16,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005486732.xkownj\",\"status\":\"up:active\",\"gid\":26888}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":5,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":71,\"modified\":\"2025-10-14T10:01:49.826736+0000\",\"services\":{\"mgr\":{\"daemons\":{\"summary\":\"\",\"np0005486731.swasqz\":{\"start_epoch\":0,\"start_stamp\":\"0.000000\",\"gid\":0,\"addr\":\"(unrecognized address family 0)/0\",\"metadata\":{},\"task_status\":{}},\"np0005486732.pasqzz\":{\"start_epoch\":0,\"start_stamp\":\"0.000000\",\"gid\":0,\"addr\":\"(unrecognized address family 0)/0\",\"metadata\":{},\"task_status\":{}},\"np0005486733.primvu\":{\"start_epoch\":0,\"start_stamp\":\"0.000000\",\"gid\":0,\"addr\":\"(unrecognized address family 0)/0\",\"metadata\":{},\"task_status\":{}}}}}},\"progress_events\":{}}", "stdout_lines": ["", 
"{\"fsid\":\"fcadf6e2-9176-5818-a8d0-37b19acf8eaf\",\"health\":{\"status\":\"HEALTH_OK\",\"checks\":{},\"mutes\":[]},\"election_epoch\":28,\"quorum\":[0,1,2,3,4,5],\"quorum_names\":[\"np0005486728\",\"np0005486730\",\"np0005486729\",\"np0005486733\",\"np0005486732\",\"np0005486731\"],\"quorum_age\":4,\"monmap\":{\"epoch\":6,\"min_mon_release_name\":\"reef\",\"num_mons\":6},\"osdmap\":{\"epoch\":79,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1760428371,\"num_in_osds\":6,\"osd_in_since\":1760428350,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109571242,\"bytes_used\":596840448,\"bytes_avail\":44475150336,\"bytes_total\":45071990784},\"fsmap\":{\"epoch\":16,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005486732.xkownj\",\"status\":\"up:active\",\"gid\":26888}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":5,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":71,\"modified\":\"2025-10-14T10:01:49.826736+0000\",\"services\":{\"mgr\":{\"daemons\":{\"summary\":\"\",\"np0005486731.swasqz\":{\"start_epoch\":0,\"start_stamp\":\"0.000000\",\"gid\":0,\"addr\":\"(unrecognized address family 0)/0\",\"metadata\":{},\"task_status\":{}},\"np0005486732.pasqzz\":{\"start_epoch\":0,\"start_stamp\":\"0.000000\",\"gid\":0,\"addr\":\"(unrecognized address family 0)/0\",\"metadata\":{},\"task_status\":{}},\"np0005486733.primvu\":{\"start_epoch\":0,\"start_stamp\":\"0.000000\",\"gid\":0,\"addr\":\"(unrecognized address family 0)/0\",\"metadata\":{},\"task_status\":{}}}}}},\"progress_events\":{}}"]} TASK [ceph_migrate : Backup data for client purposes] ************************** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "cur_mon != client_node", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - Get the current active mgr] ************************* changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "mgr", "stat", "-f", "json"], "delta": "0:00:00.798252", "end": "2025-10-14 10:02:17.480344", "msg": "", "rc": 0, "start": "2025-10-14 10:02:16.682092", "stderr": "", "stderr_lines": [], "stdout": "\n{\"epoch\":14,\"available\":true,\"active_name\":\"np0005486728.giajub\",\"num_standby\":5}", "stdout_lines": ["", "{\"epoch\":14,\"available\":true,\"active_name\":\"np0005486728.giajub\",\"num_standby\":5}"]} TASK [ceph_migrate : MON - Load mgr data] ************************************** ok: [np0005486728.localdomain] => {"ansible_facts": {"mgr": {"active_name": "np0005486728.giajub", "available": true, "epoch": 14, "num_standby": 5}}, "changed": false} TASK [ceph_migrate : Print active mgr] ***************************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Fail mgr if active in the current node] ******************* included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/fail_mgr.yaml for np0005486728.localdomain TASK [ceph_migrate : Refresh ceph_cli] 
***************************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_cli.yaml for np0005486728.localdomain TASK [ceph_migrate : Set ceph CLI] ********************************************* ok: [np0005486728.localdomain] => {"ansible_facts": {"ceph_cli": "podman run --rm --net=host --ipc=host --volume /home/tripleo-admin/ceph_client:/etc/ceph:z --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring "}, "changed": false} TASK [ceph_migrate : Force fail ceph mgr] ************************************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "mgr", "fail"], "delta": "0:00:00.783194", "end": "2025-10-14 10:02:19.059485", "msg": "", "rc": 0, "start": "2025-10-14 10:02:18.276291", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : Wait for cephadm to reconcile] **************************** Pausing for 10 seconds ok: [np0005486728.localdomain] => {"changed": false, "delta": 10, "echo": true, "rc": 0, "start": "2025-10-14 10:02:19.169724", "stderr": "", "stdout": "Paused for 10.01 seconds", "stop": "2025-10-14 10:02:29.182395", "user_input": ""} TASK [ceph_migrate : Get the ceph orchestrator status with] ******************** ASYNC OK on np0005486728.localdomain: jid=j928460555916.493789 changed: [np0005486728.localdomain] => {"ansible_job_id": "j928460555916.493789", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "status", "--format", "json"], "delta": "0:00:00.753866", "end": "2025-10-14 10:02:30.890694", "failed_when_result": false, "finished": 1, "msg": "", "rc": 0, "results_file": "/root/.ansible_async/j928460555916.493789", "start": "2025-10-14 10:02:30.136828", "started": 1, "stderr": "", "stderr_lines": [], "stdout": "\n{\"available\": true, \"backend\": \"cephadm\", \"paused\": false, \"workers\": 10}", "stdout_lines": ["", "{\"available\": true, \"backend\": \"cephadm\", \"paused\": false, \"workers\": 10}"]} TASK [ceph_migrate : Restart the active mgr] *********************************** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "\"Timeout exceeded\" in ceph_orch_status.msg", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Fail if ceph orchestrator is still not responding] ******** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "\"Timeout exceeded\" in ceph_orch_status.msg", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - Drain and rm --force the cur_mon host] ************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/drain.yaml for np0005486728.localdomain TASK [ceph_migrate : Get ceph_cli] ********************************************* included: 
/home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_cli.yaml for np0005486728.localdomain TASK [ceph_migrate : Set ceph CLI] ********************************************* ok: [np0005486728.localdomain] => {"ansible_facts": {"ceph_cli": "podman run --rm --net=host --ipc=host --volume /home/tripleo-admin/ceph_client:/etc/ceph:z --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring "}, "changed": false} TASK [ceph_migrate : MON - wait daemons] *************************************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "ps", "--daemon_type", "mon", "--daemon_id", "np0005486728", "-f", "json"], "delta": "0:00:00.626724", "end": "2025-10-14 10:02:33.365964", "msg": "", "rc": 0, "start": "2025-10-14 10:02:32.739240", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"container_id\": \"f7e74dd64e2c\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"1.71%\", \"created\": \"2025-10-14T07:49:53.401238Z\", \"daemon_id\": \"np0005486728\", \"daemon_name\": \"mon.np0005486728\", \"daemon_type\": \"mon\", \"hostname\": \"np0005486728.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:02:20.695025Z\", \"memory_request\": 2147483648, \"memory_usage\": 148163788, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2025-10-14T07:49:56.412200Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}]", "stdout_lines": ["", "[{\"container_id\": \"f7e74dd64e2c\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"1.71%\", \"created\": \"2025-10-14T07:49:53.401238Z\", \"daemon_id\": \"np0005486728\", \"daemon_name\": \"mon.np0005486728\", \"daemon_type\": \"mon\", \"hostname\": \"np0005486728.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:02:20.695025Z\", \"memory_request\": 2147483648, \"memory_usage\": 148163788, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2025-10-14T07:49:56.412200Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}]"]} TASK [ceph_migrate : DRAIN - Delete the mon running on the current controller node] *** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", 
"/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "daemon", "rm", "mon.np0005486728", "--force"], "delta": "0:00:05.981192", "end": "2025-10-14 10:02:39.901167", "msg": "", "rc": 0, "start": "2025-10-14 10:02:33.919975", "stderr": "2025-10-14T10:02:34.663+0000 7fa805d74640 0 --2- 172.18.0.103:0/2494053530 >> [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] conn(0x7fa7e0158960 0x7fa7e015ad50 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2", "stderr_lines": ["2025-10-14T10:02:34.663+0000 7fa805d74640 0 --2- 172.18.0.103:0/2494053530 >> [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] conn(0x7fa7e0158960 0x7fa7e015ad50 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2"], "stdout": "Removed mon.np0005486728 from host 'np0005486728.localdomain'", "stdout_lines": ["Removed mon.np0005486728 from host 'np0005486728.localdomain'"]} TASK [ceph_migrate : DRAIN - remove label from the src node] ******************* included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/labels.yaml for np0005486728.localdomain TASK [ceph_migrate : Set/Unset labels - rm] ************************************ skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Print nodes] ********************************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Set/Unset labels - rm] ************************************ changed: [np0005486728.localdomain] => (item=['np0005486728.localdomain', 'mon']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "rm", "np0005486728.localdomain", "mon"], "delta": "0:00:00.738834", "end": "2025-10-14 10:02:41.382227", "item": ["np0005486728.localdomain", "mon"], "msg": "", "rc": 0, "start": "2025-10-14 10:02:40.643393", "stderr": "", "stderr_lines": [], "stdout": "Removed label mon from host np0005486728.localdomain", "stdout_lines": ["Removed label mon from host np0005486728.localdomain"]} changed: [np0005486728.localdomain] => (item=['np0005486728.localdomain', 'mgr']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "rm", "np0005486728.localdomain", "mgr"], "delta": "0:00:00.718717", "end": "2025-10-14 10:02:42.697163", "item": ["np0005486728.localdomain", "mgr"], "msg": "", "rc": 0, "start": "2025-10-14 10:02:41.978446", "stderr": "", "stderr_lines": [], "stdout": "Removed label mgr from host np0005486728.localdomain", "stdout_lines": ["Removed label mgr from 
host np0005486728.localdomain"]} changed: [np0005486728.localdomain] => (item=['np0005486728.localdomain', '_admin']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "rm", "np0005486728.localdomain", "_admin"], "delta": "0:00:00.619780", "end": "2025-10-14 10:02:43.884975", "item": ["np0005486728.localdomain", "_admin"], "msg": "", "rc": 0, "start": "2025-10-14 10:02:43.265195", "stderr": "", "stderr_lines": [], "stdout": "Removed label _admin from host np0005486728.localdomain", "stdout_lines": ["Removed label _admin from host np0005486728.localdomain"]} TASK [ceph_migrate : Wait for the orchestrator to remove labels] *************** Pausing for 10 seconds ok: [np0005486728.localdomain] => {"changed": false, "delta": 10, "echo": true, "rc": 0, "start": "2025-10-14 10:02:43.999203", "stderr": "", "stdout": "Paused for 10.01 seconds", "stop": "2025-10-14 10:02:54.012024", "user_input": ""} TASK [ceph_migrate : DRAIN - Drain the host] *********************************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "drain", "np0005486728.localdomain"], "delta": "0:00:00.720578", "end": "2025-10-14 10:02:55.305120", "msg": "", "rc": 0, "start": "2025-10-14 10:02:54.584542", "stderr": "", "stderr_lines": [], "stdout": "Scheduled to remove the following daemons from host 'np0005486728.localdomain'\ntype id \n-------------------- ---------------\nmgr np0005486728.giajub\ncrash np0005486728 ", "stdout_lines": ["Scheduled to remove the following daemons from host 'np0005486728.localdomain'", "type id ", "-------------------- ---------------", "mgr np0005486728.giajub", "crash np0005486728 "]} TASK [ceph_migrate : DRAIN - cleanup the host] ********************************* skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "force_clean | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - check host in hostmap] ****************************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "ls", "--host_pattern", "np0005486728.localdomain", "-f", "json"], "delta": "0:00:00.720855", "end": "2025-10-14 10:02:56.714549", "msg": "", "rc": 0, "start": "2025-10-14 10:02:55.993694", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"addr\": \"192.168.122.103\", \"hostname\": \"np0005486728.localdomain\", \"labels\": [\"_no_schedule\", \"_no_conf_keyring\"], \"status\": \"\"}]", "stdout_lines": ["", "[{\"addr\": \"192.168.122.103\", \"hostname\": \"np0005486728.localdomain\", \"labels\": [\"_no_schedule\", 
\"_no_conf_keyring\"], \"status\": \"\"}]"]} TASK [ceph_migrate : MON - rm the cur_mon host from the Ceph cluster] ********** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "rm", "np0005486728.localdomain", "--force"], "delta": "0:00:00.722894", "end": "2025-10-14 10:02:58.106161", "msg": "", "rc": 0, "start": "2025-10-14 10:02:57.383267", "stderr": "", "stderr_lines": [], "stdout": "Removed host 'np0005486728.localdomain'", "stdout_lines": ["Removed host 'np0005486728.localdomain'"]} TASK [ceph_migrate : MON - Get current mon IP address] ************************* skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "mon_ipaddr is not defined", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - Migrate the mon ip address from src to target node] *** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/network.yaml for np0005486728.localdomain TASK [ceph_migrate : MON - Print current mon IP address] *********************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : MON - Patch net-config config and remove the IP address (src node)] *** changed: [np0005486728.localdomain] => {"backup": "/etc/os-net-config/tripleo_config.yaml.496173.2025-10-14@10:02:58~", "changed": true, "found": 1, "msg": "1 line(s) removed"} TASK [ceph_migrate : MON - Refresh os-net-config (src node)] ******************* skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "not manual_migration | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - manually rm the ip address (src node)] ************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["ip", "a", "del", "172.18.0.103/24", "dev", "vlan21"], "delta": "0:00:00.004632", "end": "2025-10-14 10:02:59.518452", "msg": "", "rc": 0, "start": "2025-10-14 10:02:59.513820", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : MON - Patch net-config config and add the IP address (target node)] *** changed: [np0005486728.localdomain -> np0005486731.localdomain(192.168.122.106)] => {"backup": "/etc/os-net-config/tripleo_config.yaml.305106.2025-10-14@10:03:00~", "changed": true, "msg": "line added"} TASK [ceph_migrate : MON - Refresh os-net-config (target_node)] **************** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "not manual_migration | bool | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - statically assign the ip address to the target node] *** changed: [np0005486728.localdomain -> np0005486731.localdomain(192.168.122.106)] => {"changed": true, "cmd": ["ip", "a", "add", "172.18.0.103/24", "dev", "vlan21"], "delta": "0:00:00.004936", "end": "2025-10-14 10:03:01.565551", "msg": "", "rc": 0, "start": "2025-10-14 10:03:01.560615", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : MON - ping ip address to see if is reachable on the target node] *** changed: [np0005486728.localdomain -> np0005486731.localdomain(192.168.122.106)] => 
{"changed": true, "cmd": ["ping", "-W1", "-c", "3", "172.18.0.103"], "delta": "0:00:02.054358", "end": "2025-10-14 10:03:04.355517", "msg": "", "rc": 0, "start": "2025-10-14 10:03:02.301159", "stderr": "", "stderr_lines": [], "stdout": "PING 172.18.0.103 (172.18.0.103) 56(84) bytes of data.\n64 bytes from 172.18.0.103: icmp_seq=1 ttl=64 time=0.073 ms\n64 bytes from 172.18.0.103: icmp_seq=2 ttl=64 time=0.039 ms\n64 bytes from 172.18.0.103: icmp_seq=3 ttl=64 time=0.062 ms\n\n--- 172.18.0.103 ping statistics ---\n3 packets transmitted, 3 received, 0% packet loss, time 2048ms\nrtt min/avg/max/mdev = 0.039/0.058/0.073/0.014 ms", "stdout_lines": ["PING 172.18.0.103 (172.18.0.103) 56(84) bytes of data.", "64 bytes from 172.18.0.103: icmp_seq=1 ttl=64 time=0.073 ms", "64 bytes from 172.18.0.103: icmp_seq=2 ttl=64 time=0.039 ms", "64 bytes from 172.18.0.103: icmp_seq=3 ttl=64 time=0.062 ms", "", "--- 172.18.0.103 ping statistics ---", "3 packets transmitted, 3 received, 0% packet loss, time 2048ms", "rtt min/avg/max/mdev = 0.039/0.058/0.073/0.014 ms"]} TASK [ceph_migrate : MON - Fail if the IP address is not active in the target node] *** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ping_target_ip.rc != 0", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Unmanage mons] ******************************************** ok: [np0005486728.localdomain] => {"changed": false, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "-v", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "-v", "/var/lib/ceph/:/var/lib/ceph/:z", "-v", "/var/log/ceph/:/var/log/ceph/:z", "-v", "/home/tripleo-admin/mon:/home/tripleo-admin/mon:z", "--entrypoint=ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "-n", "client.admin", "-k", "/etc/ceph/ceph.client.admin.keyring", "--cluster", "ceph", "orch", "apply", "--in-file", "/home/tripleo-admin/mon"], "delta": "0:00:00.777994", "end": "2025-10-14 10:03:05.797819", "rc": 0, "start": "2025-10-14 10:03:05.019825", "stderr": "", "stderr_lines": [], "stdout": "Scheduled mon update...", "stdout_lines": ["Scheduled mon update..."]} TASK [ceph_migrate : Print the resulting spec] ********************************* skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : MON - Get tmp mon] **************************************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "ps", "--daemon_type", "mon", "--daemon_id", "np0005486731", "-f", "json"], "delta": "0:00:00.732112", "end": "2025-10-14 10:03:07.159989", "msg": "", "rc": 0, "start": "2025-10-14 10:03:06.427877", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"container_id\": \"8bb7ee7976ae\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"3.34%\", \"created\": 
\"2025-10-14T10:02:01.285904Z\", \"daemon_id\": \"np0005486731\", \"daemon_name\": \"mon.np0005486731\", \"daemon_type\": \"mon\", \"events\": [\"2025-10-14T10:02:40.685041Z daemon:mon.np0005486731 [INFO] \\\"Reconfigured mon.np0005486731 on host 'np0005486731.localdomain'\\\"\"], \"hostname\": \"np0005486731.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:02:21.202110Z\", \"memory_request\": 2147483648, \"memory_usage\": 38241566, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2025-10-14T10:02:01.189916Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}]", "stdout_lines": ["", "[{\"container_id\": \"8bb7ee7976ae\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"3.34%\", \"created\": \"2025-10-14T10:02:01.285904Z\", \"daemon_id\": \"np0005486731\", \"daemon_name\": \"mon.np0005486731\", \"daemon_type\": \"mon\", \"events\": [\"2025-10-14T10:02:40.685041Z daemon:mon.np0005486731 [INFO] \\\"Reconfigured mon.np0005486731 on host 'np0005486731.localdomain'\\\"\"], \"hostname\": \"np0005486731.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:02:21.202110Z\", \"memory_request\": 2147483648, \"memory_usage\": 38241566, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2025-10-14T10:02:01.189916Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}]"]} TASK [ceph_migrate : MON - Delete the running mon] ***************************** changed: [np0005486728.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "daemon", "rm", "mon.np0005486731", "--force"], "delta": "0:00:02.491820", "end": "2025-10-14 10:03:10.256432", "msg": "", "rc": 0, "start": "2025-10-14 10:03:07.764612", "stderr": "", "stderr_lines": [], "stdout": "Removed mon.np0005486731 from host 'np0005486731.localdomain'", "stdout_lines": ["Removed mon.np0005486731 from host 'np0005486731.localdomain'"]} TASK [ceph_migrate : MON - Wait for the current mon to be deleted] ************* Pausing for 10 seconds ok: [np0005486728.localdomain] => {"changed": false, "delta": 10, "echo": true, "rc": 0, "start": "2025-10-14 10:03:10.398858", "stderr": "", "stdout": "Paused for 10.01 seconds", "stop": "2025-10-14 10:03:20.411019", "user_input": ""} TASK [ceph_migrate : MON - Redeploy mon on np0005486731.localdomain] *********** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : MON - Redeploy mon on np0005486731.localdomain] *********** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", 
"/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "daemon", "add", "mon", "np0005486731.localdomain:172.18.0.103"], "delta": "0:00:03.344175", "end": "2025-10-14 10:03:24.295604", "msg": "", "rc": 0, "start": "2025-10-14 10:03:20.951429", "stderr": "", "stderr_lines": [], "stdout": "Deployed mon.np0005486731 on host 'np0005486731.localdomain'", "stdout_lines": ["Deployed mon.np0005486731 on host 'np0005486731.localdomain'"]} TASK [ceph_migrate : MON - Wait for the spec to be updated] ******************** Pausing for 10 seconds ok: [np0005486728.localdomain] => {"changed": false, "delta": 10, "echo": true, "rc": 0, "start": "2025-10-14 10:03:24.423614", "stderr": "", "stdout": "Paused for 10.01 seconds", "stop": "2025-10-14 10:03:34.437093", "user_input": ""} TASK [ceph_migrate : MON - Check mons quorum] ********************************** changed: [np0005486728.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "-s", "-f", "json"], "delta": "0:00:00.807993", "end": "2025-10-14 10:03:35.745211", "msg": "", "rc": 0, "start": "2025-10-14 10:03:34.937218", "stderr": "", "stderr_lines": [], "stdout": "\n{\"fsid\":\"fcadf6e2-9176-5818-a8d0-37b19acf8eaf\",\"health\":{\"status\":\"HEALTH_OK\",\"checks\":{},\"mutes\":[]},\"election_epoch\":40,\"quorum\":[0,1,2,3,4],\"quorum_names\":[\"np0005486730\",\"np0005486729\",\"np0005486733\",\"np0005486732\",\"np0005486731\"],\"quorum_age\":3,\"monmap\":{\"epoch\":9,\"min_mon_release_name\":\"reef\",\"num_mons\":5},\"osdmap\":{\"epoch\":80,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1760428371,\"num_in_osds\":6,\"osd_in_since\":1760428350,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109572084,\"bytes_used\":615383040,\"bytes_avail\":44456607744,\"bytes_total\":45071990784},\"fsmap\":{\"epoch\":16,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005486732.xkownj\",\"status\":\"up:active\",\"gid\":26888}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":5,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":75,\"modified\":\"2025-10-14T10:03:26.933719+0000\",\"services\":{}},\"progress_events\":{}}", "stdout_lines": ["", 
"{\"fsid\":\"fcadf6e2-9176-5818-a8d0-37b19acf8eaf\",\"health\":{\"status\":\"HEALTH_OK\",\"checks\":{},\"mutes\":[]},\"election_epoch\":40,\"quorum\":[0,1,2,3,4],\"quorum_names\":[\"np0005486730\",\"np0005486729\",\"np0005486733\",\"np0005486732\",\"np0005486731\"],\"quorum_age\":3,\"monmap\":{\"epoch\":9,\"min_mon_release_name\":\"reef\",\"num_mons\":5},\"osdmap\":{\"epoch\":80,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1760428371,\"num_in_osds\":6,\"osd_in_since\":1760428350,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109572084,\"bytes_used\":615383040,\"bytes_avail\":44456607744,\"bytes_total\":45071990784},\"fsmap\":{\"epoch\":16,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005486732.xkownj\",\"status\":\"up:active\",\"gid\":26888}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":5,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":75,\"modified\":\"2025-10-14T10:03:26.933719+0000\",\"services\":{}},\"progress_events\":{}}"]} TASK [ceph_migrate : reconfig osds] ******************************************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "reconfig", "osd.default_drive_group"], "delta": "0:00:00.869927", "end": "2025-10-14 10:03:37.240904", "msg": "", "rc": 0, "start": "2025-10-14 10:03:36.370977", "stderr": "", "stderr_lines": [], "stdout": "Scheduled to reconfig osd.2 on host 'np0005486731.localdomain'\nScheduled to reconfig osd.4 on host 'np0005486731.localdomain'\nScheduled to reconfig osd.1 on host 'np0005486732.localdomain'\nScheduled to reconfig osd.5 on host 'np0005486732.localdomain'\nScheduled to reconfig osd.0 on host 'np0005486733.localdomain'\nScheduled to reconfig osd.3 on host 'np0005486733.localdomain'", "stdout_lines": ["Scheduled to reconfig osd.2 on host 'np0005486731.localdomain'", "Scheduled to reconfig osd.4 on host 'np0005486731.localdomain'", "Scheduled to reconfig osd.1 on host 'np0005486732.localdomain'", "Scheduled to reconfig osd.5 on host 'np0005486732.localdomain'", "Scheduled to reconfig osd.0 on host 'np0005486733.localdomain'", "Scheduled to reconfig osd.3 on host 'np0005486733.localdomain'"]} TASK [ceph_migrate : force-fail ceph mgr] ************************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/fail_mgr.yaml for np0005486728.localdomain TASK [ceph_migrate : Refresh ceph_cli] ***************************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_cli.yaml for np0005486728.localdomain TASK [ceph_migrate : Set ceph CLI] ********************************************* ok: [np0005486728.localdomain] => {"ansible_facts": {"ceph_cli": "podman run --rm --net=host --ipc=host --volume /home/tripleo-admin/ceph_client:/etc/ceph:z --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring "}, 
"changed": false} TASK [ceph_migrate : Force fail ceph mgr] ************************************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "mgr", "fail"], "delta": "0:00:00.784014", "end": "2025-10-14 10:03:38.786218", "msg": "", "rc": 0, "start": "2025-10-14 10:03:38.002204", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : Wait for cephadm to reconcile] **************************** Pausing for 10 seconds ok: [np0005486728.localdomain] => {"changed": false, "delta": 10, "echo": true, "rc": 0, "start": "2025-10-14 10:03:38.917637", "stderr": "", "stdout": "Paused for 10.01 seconds", "stop": "2025-10-14 10:03:48.929607", "user_input": ""} TASK [ceph_migrate : Get the ceph orchestrator status with] ******************** ASYNC OK on np0005486728.localdomain: jid=j55217999562.497901 changed: [np0005486728.localdomain] => {"ansible_job_id": "j55217999562.497901", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "status", "--format", "json"], "delta": "0:00:00.808276", "end": "2025-10-14 10:03:50.470709", "failed_when_result": false, "finished": 1, "msg": "", "rc": 0, "results_file": "/root/.ansible_async/j55217999562.497901", "start": "2025-10-14 10:03:49.662433", "started": 1, "stderr": "", "stderr_lines": [], "stdout": "\n{\"available\": true, \"backend\": \"cephadm\", \"paused\": false, \"workers\": 10}", "stdout_lines": ["", "{\"available\": true, \"backend\": \"cephadm\", \"paused\": false, \"workers\": 10}"]} TASK [ceph_migrate : Restart the active mgr] *********************************** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "\"Timeout exceeded\" in ceph_orch_status.msg", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Fail if ceph orchestrator is still not responding] ******** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "\"Timeout exceeded\" in ceph_orch_status.msg", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - Manage mons] **************************************** ok: [np0005486728.localdomain] => {"changed": false, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "-v", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "-v", "/var/lib/ceph/:/var/lib/ceph/:z", "-v", "/var/log/ceph/:/var/log/ceph/:z", "-v", "/home/tripleo-admin/mon:/home/tripleo-admin/mon:z", "--entrypoint=ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "-n", "client.admin", "-k", "/etc/ceph/ceph.client.admin.keyring", "--cluster", "ceph", "orch", "apply", "--in-file", "/home/tripleo-admin/mon"], "delta": "0:00:00.708410", "end": "2025-10-14 10:03:52.778796", "rc": 0, "start": "2025-10-14 10:03:52.070386", "stderr": "", "stderr_lines": [], "stdout": "Scheduled mon update...", "stdout_lines": ["Scheduled mon update..."]} TASK [ceph_migrate : MON - wait daemons] *************************************** included: 
/home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/wait_daemons.yaml for np0005486728.localdomain TASK [ceph_migrate : print daemon id option] *********************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : wait for mon] ********************************************* changed: [np0005486728.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "ps", "--daemon_type", "mon", "--daemon_id", "np0005486731", "-f", "json"], "delta": "0:00:00.649593", "end": "2025-10-14 10:03:54.123225", "msg": "", "rc": 0, "start": "2025-10-14 10:03:53.473632", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"container_id\": \"dad8389d42da\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"3.57%\", \"created\": \"2025-10-14T10:03:24.087095Z\", \"daemon_id\": \"np0005486731\", \"daemon_name\": \"mon.np0005486731\", \"daemon_type\": \"mon\", \"hostname\": \"np0005486731.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:03:41.148612Z\", \"memory_request\": 2147483648, \"memory_usage\": 57682165, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2025-10-14T10:03:23.996951Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}]", "stdout_lines": ["", "[{\"container_id\": \"dad8389d42da\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"3.57%\", \"created\": \"2025-10-14T10:03:24.087095Z\", \"daemon_id\": \"np0005486731\", \"daemon_name\": \"mon.np0005486731\", \"daemon_type\": \"mon\", \"hostname\": \"np0005486731.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:03:41.148612Z\", \"memory_request\": 2147483648, \"memory_usage\": 57682165, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2025-10-14T10:03:23.996951Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}]"]} TASK [ceph_migrate : Get ceph_cli] ********************************************* included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_cli.yaml for np0005486728.localdomain TASK [ceph_migrate : Set ceph CLI] ********************************************* ok: [np0005486728.localdomain] => {"ansible_facts": {"ceph_cli": "podman run --rm --net=host --ipc=host --volume /home/tripleo-admin/ceph_client:/etc/ceph:z --entrypoint 
ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring "}, "changed": false} TASK [ceph_migrate : Migrate mon] ********************************************** ok: [np0005486728.localdomain] => { "msg": "Migrate mon: np0005486729.localdomain to node: np0005486732.localdomain" } TASK [ceph_migrate : MON - Get current mon IP address from node_map override] *** ok: [np0005486728.localdomain] => {"ansible_facts": {"mon_ipaddr": "172.18.0.104"}, "changed": false} TASK [ceph_migrate : MON - Check mons quorum] ********************************** changed: [np0005486728.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "-s", "-f", "json"], "delta": "0:00:00.704096", "end": "2025-10-14 10:03:55.663426", "msg": "", "rc": 0, "start": "2025-10-14 10:03:54.959330", "stderr": "", "stderr_lines": [], "stdout": "\n{\"fsid\":\"fcadf6e2-9176-5818-a8d0-37b19acf8eaf\",\"health\":{\"status\":\"HEALTH_WARN\",\"checks\":{\"CEPHADM_STRAY_DAEMON\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"1 stray daemon(s) not managed by cephadm\",\"count\":1},\"muted\":false},\"CEPHADM_STRAY_HOST\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"1 stray host(s) with 1 daemon(s) not managed by cephadm\",\"count\":1},\"muted\":false}},\"mutes\":[]},\"election_epoch\":40,\"quorum\":[0,1,2,3,4],\"quorum_names\":[\"np0005486730\",\"np0005486729\",\"np0005486733\",\"np0005486732\",\"np0005486731\"],\"quorum_age\":23,\"monmap\":{\"epoch\":9,\"min_mon_release_name\":\"reef\",\"num_mons\":5},\"osdmap\":{\"epoch\":81,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1760428371,\"num_in_osds\":6,\"osd_in_since\":1760428350,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109573208,\"bytes_used\":615452672,\"bytes_avail\":44456538112,\"bytes_total\":45071990784,\"read_bytes_sec\":19094,\"write_bytes_sec\":0,\"read_op_per_sec\":8,\"write_op_per_sec\":1},\"fsmap\":{\"epoch\":16,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005486732.xkownj\",\"status\":\"up:active\",\"gid\":26888}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":5,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":75,\"modified\":\"2025-10-14T10:03:26.933719+0000\",\"services\":{}},\"progress_events\":{}}", "stdout_lines": ["", "{\"fsid\":\"fcadf6e2-9176-5818-a8d0-37b19acf8eaf\",\"health\":{\"status\":\"HEALTH_WARN\",\"checks\":{\"CEPHADM_STRAY_DAEMON\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"1 stray daemon(s) not managed by cephadm\",\"count\":1},\"muted\":false},\"CEPHADM_STRAY_HOST\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"1 stray host(s) with 1 daemon(s) not managed by 
cephadm\",\"count\":1},\"muted\":false}},\"mutes\":[]},\"election_epoch\":40,\"quorum\":[0,1,2,3,4],\"quorum_names\":[\"np0005486730\",\"np0005486729\",\"np0005486733\",\"np0005486732\",\"np0005486731\"],\"quorum_age\":23,\"monmap\":{\"epoch\":9,\"min_mon_release_name\":\"reef\",\"num_mons\":5},\"osdmap\":{\"epoch\":81,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1760428371,\"num_in_osds\":6,\"osd_in_since\":1760428350,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109573208,\"bytes_used\":615452672,\"bytes_avail\":44456538112,\"bytes_total\":45071990784,\"read_bytes_sec\":19094,\"write_bytes_sec\":0,\"read_op_per_sec\":8,\"write_op_per_sec\":1},\"fsmap\":{\"epoch\":16,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005486732.xkownj\",\"status\":\"up:active\",\"gid\":26888}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":5,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":75,\"modified\":\"2025-10-14T10:03:26.933719+0000\",\"services\":{}},\"progress_events\":{}}"]} TASK [ceph_migrate : Backup data for client purposes] ************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/backup.yaml for np0005486728.localdomain TASK [ceph_migrate : Ensure backup directory exists] *************************** changed: [np0005486728.localdomain -> np0005486729.localdomain(192.168.122.104)] => {"changed": true, "gid": 1003, "group": "tripleo-admin", "mode": "0755", "owner": "tripleo-admin", "path": "/home/tripleo-admin/ceph_client", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 6, "state": "directory", "uid": 1003} TASK [ceph_migrate : Check file in the src directory] ************************** ok: [np0005486728.localdomain -> np0005486729.localdomain(192.168.122.104)] => {"changed": false, "examined": 2, "files": [{"atime": 1760436223.0611515, "ctime": 1760436223.5041652, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 964690480, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1760436223.271158, "nlink": 1, "path": "/etc/ceph/ceph.conf", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 367, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, {"atime": 1760436224.3621917, "ctime": 1760436224.7892048, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 964690481, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0600", "mtime": 1760436224.565198, "nlink": 1, "path": "/etc/ceph/ceph.client.admin.keyring", "pw_name": "root", "rgrp": false, "roth": false, "rusr": true, "size": 151, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}], "matched": 2, "msg": "All paths examined", "skipped_paths": {}} TASK [ceph_migrate : Backup ceph client data] ********************************** changed: [np0005486728.localdomain -> np0005486729.localdomain(192.168.122.104)] => (item={'path': '/etc/ceph/ceph.conf', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 
'gid': 0, 'size': 367, 'inode': 964690480, 'dev': 64516, 'nlink': 1, 'atime': 1760436223.0611515, 'mtime': 1760436223.271158, 'ctime': 1760436223.5041652, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) => {"ansible_loop_var": "item", "changed": true, "checksum": "a1110a22abd56f35fa1f7745a845ec9c83563e49", "dest": "/home/tripleo-admin/ceph_client/ceph.conf", "gid": 0, "group": "root", "item": {"atime": 1760436223.0611515, "ctime": 1760436223.5041652, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 964690480, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1760436223.271158, "nlink": 1, "path": "/etc/ceph/ceph.conf", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 367, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, "md5sum": "c1c36321c4937e4077d340c983bdd253", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 367, "src": "/etc/ceph/ceph.conf", "state": "file", "uid": 0} changed: [np0005486728.localdomain -> np0005486729.localdomain(192.168.122.104)] => (item={'path': '/etc/ceph/ceph.client.admin.keyring', 'mode': '0600', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 151, 'inode': 964690481, 'dev': 64516, 'nlink': 1, 'atime': 1760436224.3621917, 'mtime': 1760436224.565198, 'ctime': 1760436224.7892048, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': False, 'xgrp': False, 'woth': False, 'roth': False, 'xoth': False, 'isuid': False, 'isgid': False}) => {"ansible_loop_var": "item", "changed": true, "checksum": "431588685abe2dfd7f03c8784108d8962e66b6df", "dest": "/home/tripleo-admin/ceph_client/ceph.client.admin.keyring", "gid": 0, "group": "root", "item": {"atime": 1760436224.3621917, "ctime": 1760436224.7892048, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 964690481, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0600", "mtime": 1760436224.565198, "nlink": 1, "path": "/etc/ceph/ceph.client.admin.keyring", "pw_name": "root", "rgrp": false, "roth": false, "rusr": true, "size": 151, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, "md5sum": "24ba031d88354429461e0fa6a2c9f8ca", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 151, "src": "/etc/ceph/ceph.client.admin.keyring", "state": "file", "uid": 0} TASK [ceph_migrate : MON - Get the current active mgr] ************************* changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "mgr", "stat", "-f", "json"], "delta": "0:00:00.676596", "end": "2025-10-14 10:04:00.450638", "msg": "", "rc": 0, "start": "2025-10-14 10:03:59.774042", "stderr": "", "stderr_lines": [], "stdout": 
"\n{\"epoch\":24,\"available\":true,\"active_name\":\"np0005486731.swasqz\",\"num_standby\":5}", "stdout_lines": ["", "{\"epoch\":24,\"available\":true,\"active_name\":\"np0005486731.swasqz\",\"num_standby\":5}"]} TASK [ceph_migrate : MON - Load mgr data] ************************************** ok: [np0005486728.localdomain] => {"ansible_facts": {"mgr": {"active_name": "np0005486731.swasqz", "available": true, "epoch": 24, "num_standby": 5}}, "changed": false} TASK [ceph_migrate : Print active mgr] ***************************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Fail mgr if active in the current node] ******************* skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "mgr.active_name | regex_search(cur_mon | split('.') | first) or mgr.active_name | regex_search(target_node | split('.') | first)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - Drain and rm --force the cur_mon host] ************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/drain.yaml for np0005486728.localdomain TASK [ceph_migrate : Get ceph_cli] ********************************************* included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_cli.yaml for np0005486728.localdomain TASK [ceph_migrate : Set ceph CLI] ********************************************* ok: [np0005486728.localdomain] => {"ansible_facts": {"ceph_cli": "podman run --rm --net=host --ipc=host --volume /home/tripleo-admin/ceph_client:/etc/ceph:z --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring "}, "changed": false} TASK [ceph_migrate : MON - wait daemons] *************************************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "ps", "--daemon_type", "mon", "--daemon_id", "np0005486729", "-f", "json"], "delta": "0:00:00.729478", "end": "2025-10-14 10:04:02.028738", "msg": "", "rc": 0, "start": "2025-10-14 10:04:01.299260", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"container_id\": \"2b8734807679\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"1.16%\", \"created\": \"2025-10-14T07:52:01.205199Z\", \"daemon_id\": \"np0005486729\", \"daemon_name\": \"mon.np0005486729\", \"daemon_type\": \"mon\", \"events\": [\"2025-10-14T10:04:01.532753Z daemon:mon.np0005486729 [INFO] \\\"Reconfigured mon.np0005486729 on host 'np0005486729.localdomain'\\\"\"], \"hostname\": \"np0005486729.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:03:40.735917Z\", \"memory_request\": 2147483648, \"memory_usage\": 
142501478, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2025-10-14T07:52:01.067211Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}]", "stdout_lines": ["", "[{\"container_id\": \"2b8734807679\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"1.16%\", \"created\": \"2025-10-14T07:52:01.205199Z\", \"daemon_id\": \"np0005486729\", \"daemon_name\": \"mon.np0005486729\", \"daemon_type\": \"mon\", \"events\": [\"2025-10-14T10:04:01.532753Z daemon:mon.np0005486729 [INFO] \\\"Reconfigured mon.np0005486729 on host 'np0005486729.localdomain'\\\"\"], \"hostname\": \"np0005486729.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:03:40.735917Z\", \"memory_request\": 2147483648, \"memory_usage\": 142501478, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2025-10-14T07:52:01.067211Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}]"]} TASK [ceph_migrate : DRAIN - Delete the mon running on the current controller node] *** changed: [np0005486728.localdomain -> np0005486729.localdomain(192.168.122.104)] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "daemon", "rm", "mon.np0005486729", "--force"], "delta": "0:00:03.978396", "end": "2025-10-14 10:04:06.798570", "msg": "", "rc": 0, "start": "2025-10-14 10:04:02.820174", "stderr": "", "stderr_lines": [], "stdout": "Removed mon.np0005486729 from host 'np0005486729.localdomain'", "stdout_lines": ["Removed mon.np0005486729 from host 'np0005486729.localdomain'"]} TASK [ceph_migrate : DRAIN - remove label from the src node] ******************* included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/labels.yaml for np0005486728.localdomain TASK [ceph_migrate : Set/Unset labels - rm] ************************************ skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Print nodes] ********************************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Set/Unset labels - rm] ************************************ changed: [np0005486728.localdomain] => (item=['np0005486729.localdomain', 'mon']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "rm", "np0005486729.localdomain", "mon"], "delta": "0:00:00.667319", "end": "2025-10-14 10:04:08.184125", "item": ["np0005486729.localdomain", "mon"], "msg": "", "rc": 0, "start": "2025-10-14 
10:04:07.516806", "stderr": "", "stderr_lines": [], "stdout": "Removed label mon from host np0005486729.localdomain", "stdout_lines": ["Removed label mon from host np0005486729.localdomain"]} changed: [np0005486728.localdomain] => (item=['np0005486729.localdomain', 'mgr']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "rm", "np0005486729.localdomain", "mgr"], "delta": "0:00:00.721168", "end": "2025-10-14 10:04:09.492726", "item": ["np0005486729.localdomain", "mgr"], "msg": "", "rc": 0, "start": "2025-10-14 10:04:08.771558", "stderr": "", "stderr_lines": [], "stdout": "Removed label mgr from host np0005486729.localdomain", "stdout_lines": ["Removed label mgr from host np0005486729.localdomain"]} changed: [np0005486728.localdomain] => (item=['np0005486729.localdomain', '_admin']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "rm", "np0005486729.localdomain", "_admin"], "delta": "0:00:00.696099", "end": "2025-10-14 10:04:10.733672", "item": ["np0005486729.localdomain", "_admin"], "msg": "", "rc": 0, "start": "2025-10-14 10:04:10.037573", "stderr": "", "stderr_lines": [], "stdout": "Removed label _admin from host np0005486729.localdomain", "stdout_lines": ["Removed label _admin from host np0005486729.localdomain"]} TASK [ceph_migrate : Wait for the orchestrator to remove labels] *************** Pausing for 10 seconds ok: [np0005486728.localdomain] => {"changed": false, "delta": 10, "echo": true, "rc": 0, "start": "2025-10-14 10:04:10.849782", "stderr": "", "stdout": "Paused for 10.02 seconds", "stop": "2025-10-14 10:04:20.869441", "user_input": ""} TASK [ceph_migrate : DRAIN - Drain the host] *********************************** changed: [np0005486728.localdomain -> np0005486729.localdomain(192.168.122.104)] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "drain", "np0005486729.localdomain"], "delta": "0:00:00.966938", "end": "2025-10-14 10:04:22.514083", "msg": "", "rc": 0, "start": "2025-10-14 10:04:21.547145", "stderr": "", "stderr_lines": [], "stdout": "Scheduled to remove the following daemons from host 'np0005486729.localdomain'\ntype id \n-------------------- ---------------\ncrash np0005486729 ", "stdout_lines": ["Scheduled to remove the following daemons from host 'np0005486729.localdomain'", "type id ", "-------------------- ---------------", "crash np0005486729 "]} TASK [ceph_migrate : DRAIN - cleanup the host] ********************************* skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "force_clean | default(false)", "skip_reason": "Conditional result was False"} 
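(Editor's note, for readers reconstructing this step outside of Ansible: below is a minimal shell sketch of the drain-and-remove sequence the ceph_migrate role has just run against np0005486729.localdomain, which the next two tasks complete with the host removal. CEPH_CLI is an illustrative alias mirroring the podman-based "ceph_cli" fact printed earlier in this log; it is an assumption for readability, not something the role defines under that name.)

    # Assumed wrapper, equivalent to the ceph_cli fact shown above in this run.
    CEPH_CLI="podman run --rm --net=host --ipc=host \
      --volume /home/tripleo-admin/ceph_client:/etc/ceph:z \
      --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest \
      --fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf \
      -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring"

    # 1. Remove the mon daemon from the source controller.
    $CEPH_CLI orch daemon rm mon.np0005486729 --force
    # 2. Drop the placement labels so cephadm stops scheduling to the host.
    for label in mon mgr _admin; do
      $CEPH_CLI orch host label rm np0005486729.localdomain "$label"
    done
    # 3. Drain the remaining daemons (only "crash" in this run), verify, then remove the host.
    $CEPH_CLI orch host drain np0005486729.localdomain
    $CEPH_CLI orch host ls --host_pattern np0005486729.localdomain -f json   # expect _no_schedule/_no_conf_keyring
    $CEPH_CLI orch host rm np0005486729.localdomain --force

The tasks that follow in this log then move the mon's storage-network address (172.18.0.104) from the source to the target node via os-net-config patches and "ip a del"/"ip a add" on vlan21, and finally redeploy the mon on the target with "ceph orch daemon add mon np0005486732.localdomain:172.18.0.104".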
TASK [ceph_migrate : MON - check host in hostmap] ****************************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "ls", "--host_pattern", "np0005486729.localdomain", "-f", "json"], "delta": "0:00:00.700622", "end": "2025-10-14 10:04:23.843461", "msg": "", "rc": 0, "start": "2025-10-14 10:04:23.142839", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"addr\": \"192.168.122.104\", \"hostname\": \"np0005486729.localdomain\", \"labels\": [\"_no_schedule\", \"_no_conf_keyring\"], \"status\": \"\"}]", "stdout_lines": ["", "[{\"addr\": \"192.168.122.104\", \"hostname\": \"np0005486729.localdomain\", \"labels\": [\"_no_schedule\", \"_no_conf_keyring\"], \"status\": \"\"}]"]} TASK [ceph_migrate : MON - rm the cur_mon host from the Ceph cluster] ********** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "rm", "np0005486729.localdomain", "--force"], "delta": "0:00:00.716260", "end": "2025-10-14 10:04:25.156539", "msg": "", "rc": 0, "start": "2025-10-14 10:04:24.440279", "stderr": "", "stderr_lines": [], "stdout": "Removed host 'np0005486729.localdomain'", "stdout_lines": ["Removed host 'np0005486729.localdomain'"]} TASK [ceph_migrate : MON - Get current mon IP address] ************************* skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "mon_ipaddr is not defined", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - Migrate the mon ip address from src to target node] *** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/network.yaml for np0005486728.localdomain TASK [ceph_migrate : MON - Print current mon IP address] *********************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : MON - Patch net-config config and remove the IP address (src node)] *** changed: [np0005486728.localdomain -> np0005486729.localdomain(192.168.122.104)] => {"backup": "/etc/os-net-config/tripleo_config.yaml.483362.2025-10-14@10:04:26~", "changed": true, "found": 1, "msg": "1 line(s) removed"} TASK [ceph_migrate : MON - Refresh os-net-config (src node)] ******************* skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "not manual_migration | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - manually rm the ip address (src node)] ************** changed: [np0005486728.localdomain -> np0005486729.localdomain(192.168.122.104)] => {"changed": true, "cmd": ["ip", "a", "del", "172.18.0.104/24", "dev", "vlan21"], "delta": "0:00:00.005253", "end": "2025-10-14 10:04:26.938820", "msg": "", "rc": 0, "start": "2025-10-14 10:04:26.933567", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : MON - Patch net-config config and add the IP address (target node)] 
*** changed: [np0005486728.localdomain -> np0005486732.localdomain(192.168.122.107)] => {"backup": "/etc/os-net-config/tripleo_config.yaml.309997.2025-10-14@10:04:28~", "changed": true, "msg": "line added"} TASK [ceph_migrate : MON - Refresh os-net-config (target_node)] **************** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "not manual_migration | bool | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - statically assign the ip address to the target node] *** changed: [np0005486728.localdomain -> np0005486732.localdomain(192.168.122.107)] => {"changed": true, "cmd": ["ip", "a", "add", "172.18.0.104/24", "dev", "vlan21"], "delta": "0:00:00.005019", "end": "2025-10-14 10:04:28.914302", "msg": "", "rc": 0, "start": "2025-10-14 10:04:28.909283", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : MON - ping ip address to see if is reachable on the target node] *** changed: [np0005486728.localdomain -> np0005486732.localdomain(192.168.122.107)] => {"changed": true, "cmd": ["ping", "-W1", "-c", "3", "172.18.0.104"], "delta": "0:00:02.045330", "end": "2025-10-14 10:04:31.722533", "msg": "", "rc": 0, "start": "2025-10-14 10:04:29.677203", "stderr": "", "stderr_lines": [], "stdout": "PING 172.18.0.104 (172.18.0.104) 56(84) bytes of data.\n64 bytes from 172.18.0.104: icmp_seq=1 ttl=64 time=0.040 ms\n64 bytes from 172.18.0.104: icmp_seq=2 ttl=64 time=0.039 ms\n64 bytes from 172.18.0.104: icmp_seq=3 ttl=64 time=0.034 ms\n\n--- 172.18.0.104 ping statistics ---\n3 packets transmitted, 3 received, 0% packet loss, time 2040ms\nrtt min/avg/max/mdev = 0.034/0.037/0.040/0.002 ms", "stdout_lines": ["PING 172.18.0.104 (172.18.0.104) 56(84) bytes of data.", "64 bytes from 172.18.0.104: icmp_seq=1 ttl=64 time=0.040 ms", "64 bytes from 172.18.0.104: icmp_seq=2 ttl=64 time=0.039 ms", "64 bytes from 172.18.0.104: icmp_seq=3 ttl=64 time=0.034 ms", "", "--- 172.18.0.104 ping statistics ---", "3 packets transmitted, 3 received, 0% packet loss, time 2040ms", "rtt min/avg/max/mdev = 0.034/0.037/0.040/0.002 ms"]} TASK [ceph_migrate : MON - Fail if the IP address is not active in the target node] *** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ping_target_ip.rc != 0", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Unmanage mons] ******************************************** ok: [np0005486728.localdomain] => {"changed": false, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "-v", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "-v", "/var/lib/ceph/:/var/lib/ceph/:z", "-v", "/var/log/ceph/:/var/log/ceph/:z", "-v", "/home/tripleo-admin/mon:/home/tripleo-admin/mon:z", "--entrypoint=ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "-n", "client.admin", "-k", "/etc/ceph/ceph.client.admin.keyring", "--cluster", "ceph", "orch", "apply", "--in-file", "/home/tripleo-admin/mon"], "delta": "0:00:00.605350", "end": "2025-10-14 10:04:32.979689", "rc": 0, "start": "2025-10-14 10:04:32.374339", "stderr": "", "stderr_lines": [], "stdout": "Scheduled mon update...", "stdout_lines": ["Scheduled mon update..."]} TASK [ceph_migrate : Print the resulting spec] ********************************* skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : MON - Get tmp mon] **************************************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", 
"--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "ps", "--daemon_type", "mon", "--daemon_id", "np0005486732", "-f", "json"], "delta": "0:00:00.659020", "end": "2025-10-14 10:04:34.224518", "msg": "", "rc": 0, "start": "2025-10-14 10:04:33.565498", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"container_id\": \"95ed365d9db2\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"3.31%\", \"created\": \"2025-10-14T10:01:56.076483Z\", \"daemon_id\": \"np0005486732\", \"daemon_name\": \"mon.np0005486732\", \"daemon_type\": \"mon\", \"events\": [\"2025-10-14T10:04:10.933810Z daemon:mon.np0005486732 [INFO] \\\"Reconfigured mon.np0005486732 on host 'np0005486732.localdomain'\\\"\"], \"hostname\": \"np0005486732.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:03:41.319437Z\", \"memory_request\": 2147483648, \"memory_usage\": 49125785, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2025-10-14T10:01:55.976434Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}]", "stdout_lines": ["", "[{\"container_id\": \"95ed365d9db2\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"3.31%\", \"created\": \"2025-10-14T10:01:56.076483Z\", \"daemon_id\": \"np0005486732\", \"daemon_name\": \"mon.np0005486732\", \"daemon_type\": \"mon\", \"events\": [\"2025-10-14T10:04:10.933810Z daemon:mon.np0005486732 [INFO] \\\"Reconfigured mon.np0005486732 on host 'np0005486732.localdomain'\\\"\"], \"hostname\": \"np0005486732.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:03:41.319437Z\", \"memory_request\": 2147483648, \"memory_usage\": 49125785, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2025-10-14T10:01:55.976434Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}]"]} TASK [ceph_migrate : MON - Delete the running mon] ***************************** changed: [np0005486728.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "daemon", "rm", "mon.np0005486732", "--force"], "delta": "0:00:02.305059", "end": "2025-10-14 10:04:37.170916", "msg": "", "rc": 0, "start": "2025-10-14 10:04:34.865857", "stderr": "", "stderr_lines": [], 
"stdout": "Removed mon.np0005486732 from host 'np0005486732.localdomain'", "stdout_lines": ["Removed mon.np0005486732 from host 'np0005486732.localdomain'"]} TASK [ceph_migrate : MON - Wait for the current mon to be deleted] ************* Pausing for 10 seconds ok: [np0005486728.localdomain] => {"changed": false, "delta": 10, "echo": true, "rc": 0, "start": "2025-10-14 10:04:37.292218", "stderr": "", "stdout": "Paused for 10.01 seconds", "stop": "2025-10-14 10:04:47.305542", "user_input": ""} TASK [ceph_migrate : MON - Redeploy mon on np0005486732.localdomain] *********** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : MON - Redeploy mon on np0005486732.localdomain] *********** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "daemon", "add", "mon", "np0005486732.localdomain:172.18.0.104"], "delta": "0:00:03.539653", "end": "2025-10-14 10:04:51.416946", "msg": "", "rc": 0, "start": "2025-10-14 10:04:47.877293", "stderr": "", "stderr_lines": [], "stdout": "Deployed mon.np0005486732 on host 'np0005486732.localdomain'", "stdout_lines": ["Deployed mon.np0005486732 on host 'np0005486732.localdomain'"]} TASK [ceph_migrate : MON - Wait for the spec to be updated] ******************** Pausing for 10 seconds ok: [np0005486728.localdomain] => {"changed": false, "delta": 10, "echo": true, "rc": 0, "start": "2025-10-14 10:04:51.532292", "stderr": "", "stdout": "Paused for 10.02 seconds", "stop": "2025-10-14 10:05:01.553150", "user_input": ""} TASK [ceph_migrate : MON - Check mons quorum] ********************************** changed: [np0005486728.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "-s", "-f", "json"], "delta": "0:00:07.410405", "end": "2025-10-14 10:05:09.469606", "msg": "", "rc": 0, "start": "2025-10-14 10:05:02.059201", "stderr": "", "stderr_lines": [], "stdout": "\n{\"fsid\":\"fcadf6e2-9176-5818-a8d0-37b19acf8eaf\",\"health\":{\"status\":\"HEALTH_WARN\",\"checks\":{\"CEPHADM_STRAY_DAEMON\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"1 stray daemon(s) not managed by cephadm\",\"count\":1},\"muted\":false},\"CEPHADM_STRAY_HOST\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"1 stray host(s) with 1 daemon(s) not managed by cephadm\",\"count\":1},\"muted\":false},\"MON_DOWN\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"1/4 mons down, quorum 
np0005486730,np0005486733,np0005486731\",\"count\":1},\"muted\":false}},\"mutes\":[]},\"election_epoch\":50,\"quorum\":[0,1,2],\"quorum_names\":[\"np0005486730\",\"np0005486733\",\"np0005486731\"],\"quorum_age\":0,\"monmap\":{\"epoch\":12,\"min_mon_release_name\":\"reef\",\"num_mons\":4},\"osdmap\":{\"epoch\":81,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1760428371,\"num_in_osds\":6,\"osd_in_since\":1760428350,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109573208,\"bytes_used\":615452672,\"bytes_avail\":44456538112,\"bytes_total\":45071990784},\"fsmap\":{\"epoch\":16,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005486732.xkownj\",\"status\":\"up:active\",\"gid\":26888}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":5,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":80,\"modified\":\"2025-10-14T10:04:45.787700+0000\",\"services\":{}},\"progress_events\":{}}", "stdout_lines": ["", "{\"fsid\":\"fcadf6e2-9176-5818-a8d0-37b19acf8eaf\",\"health\":{\"status\":\"HEALTH_WARN\",\"checks\":{\"CEPHADM_STRAY_DAEMON\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"1 stray daemon(s) not managed by cephadm\",\"count\":1},\"muted\":false},\"CEPHADM_STRAY_HOST\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"1 stray host(s) with 1 daemon(s) not managed by cephadm\",\"count\":1},\"muted\":false},\"MON_DOWN\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"1/4 mons down, quorum np0005486730,np0005486733,np0005486731\",\"count\":1},\"muted\":false}},\"mutes\":[]},\"election_epoch\":50,\"quorum\":[0,1,2],\"quorum_names\":[\"np0005486730\",\"np0005486733\",\"np0005486731\"],\"quorum_age\":0,\"monmap\":{\"epoch\":12,\"min_mon_release_name\":\"reef\",\"num_mons\":4},\"osdmap\":{\"epoch\":81,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1760428371,\"num_in_osds\":6,\"osd_in_since\":1760428350,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109573208,\"bytes_used\":615452672,\"bytes_avail\":44456538112,\"bytes_total\":45071990784},\"fsmap\":{\"epoch\":16,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005486732.xkownj\",\"status\":\"up:active\",\"gid\":26888}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":5,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":80,\"modified\":\"2025-10-14T10:04:45.787700+0000\",\"services\":{}},\"progress_events\":{}}"]} TASK [ceph_migrate : reconfig osds] ******************************************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "reconfig", "osd.default_drive_group"], "delta": "0:00:00.884754", "end": "2025-10-14 10:05:10.957689", "msg": "", "rc": 0, "start": "2025-10-14 10:05:10.072935", "stderr": "", "stderr_lines": [], "stdout": "Scheduled to reconfig osd.2 on host 'np0005486731.localdomain'\nScheduled to reconfig 
osd.4 on host 'np0005486731.localdomain'\nScheduled to reconfig osd.1 on host 'np0005486732.localdomain'\nScheduled to reconfig osd.5 on host 'np0005486732.localdomain'\nScheduled to reconfig osd.0 on host 'np0005486733.localdomain'\nScheduled to reconfig osd.3 on host 'np0005486733.localdomain'", "stdout_lines": ["Scheduled to reconfig osd.2 on host 'np0005486731.localdomain'", "Scheduled to reconfig osd.4 on host 'np0005486731.localdomain'", "Scheduled to reconfig osd.1 on host 'np0005486732.localdomain'", "Scheduled to reconfig osd.5 on host 'np0005486732.localdomain'", "Scheduled to reconfig osd.0 on host 'np0005486733.localdomain'", "Scheduled to reconfig osd.3 on host 'np0005486733.localdomain'"]} TASK [ceph_migrate : force-fail ceph mgr] ************************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/fail_mgr.yaml for np0005486728.localdomain TASK [ceph_migrate : Refresh ceph_cli] ***************************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_cli.yaml for np0005486728.localdomain TASK [ceph_migrate : Set ceph CLI] ********************************************* ok: [np0005486728.localdomain] => {"ansible_facts": {"ceph_cli": "podman run --rm --net=host --ipc=host --volume /home/tripleo-admin/ceph_client:/etc/ceph:z --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring "}, "changed": false} TASK [ceph_migrate : Force fail ceph mgr] ************************************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "mgr", "fail"], "delta": "0:00:00.780751", "end": "2025-10-14 10:05:12.447346", "msg": "", "rc": 0, "start": "2025-10-14 10:05:11.666595", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : Wait for cephadm to reconcile] **************************** Pausing for 10 seconds ok: [np0005486728.localdomain] => {"changed": false, "delta": 10, "echo": true, "rc": 0, "start": "2025-10-14 10:05:12.568784", "stderr": "", "stdout": "Paused for 10.01 seconds", "stop": "2025-10-14 10:05:22.580787", "user_input": ""} TASK [ceph_migrate : Get the ceph orchestrator status with] ******************** ASYNC OK on np0005486728.localdomain: jid=j381957529699.501365 changed: [np0005486728.localdomain] => {"ansible_job_id": "j381957529699.501365", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "status", "--format", "json"], "delta": "0:00:00.699839", "end": "2025-10-14 10:05:23.993429", "failed_when_result": false, "finished": 1, "msg": "", "rc": 0, "results_file": "/root/.ansible_async/j381957529699.501365", "start": "2025-10-14 10:05:23.293590", "started": 1, "stderr": "", "stderr_lines": [], "stdout": "\n{\"available\": true, \"backend\": 
\"cephadm\", \"paused\": false, \"workers\": 10}", "stdout_lines": ["", "{\"available\": true, \"backend\": \"cephadm\", \"paused\": false, \"workers\": 10}"]} TASK [ceph_migrate : Restart the active mgr] *********************************** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "\"Timeout exceeded\" in ceph_orch_status.msg", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Fail if ceph orchestrator is still not responding] ******** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "\"Timeout exceeded\" in ceph_orch_status.msg", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - Manage mons] **************************************** ok: [np0005486728.localdomain] => {"changed": false, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "-v", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "-v", "/var/lib/ceph/:/var/lib/ceph/:z", "-v", "/var/log/ceph/:/var/log/ceph/:z", "-v", "/home/tripleo-admin/mon:/home/tripleo-admin/mon:z", "--entrypoint=ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "-n", "client.admin", "-k", "/etc/ceph/ceph.client.admin.keyring", "--cluster", "ceph", "orch", "apply", "--in-file", "/home/tripleo-admin/mon"], "delta": "0:00:00.634524", "end": "2025-10-14 10:05:26.487362", "rc": 0, "start": "2025-10-14 10:05:25.852838", "stderr": "", "stderr_lines": [], "stdout": "Scheduled mon update...", "stdout_lines": ["Scheduled mon update..."]} TASK [ceph_migrate : MON - wait daemons] *************************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/wait_daemons.yaml for np0005486728.localdomain TASK [ceph_migrate : print daemon id option] *********************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : wait for mon] ********************************************* changed: [np0005486728.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "ps", "--daemon_type", "mon", "--daemon_id", "np0005486732", "-f", "json"], "delta": "0:00:00.765534", "end": "2025-10-14 10:05:27.929257", "msg": "", "rc": 0, "start": "2025-10-14 10:05:27.163723", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"container_id\": \"0d1bffb03d57\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"2.98%\", \"created\": \"2025-10-14T10:04:51.181976Z\", \"daemon_id\": \"np0005486732\", \"daemon_name\": \"mon.np0005486732\", \"daemon_type\": \"mon\", \"hostname\": \"np0005486732.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:05:14.505635Z\", \"memory_request\": 2147483648, \"memory_usage\": 50342133, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2025-10-14T10:04:51.069491Z\", 
\"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}]", "stdout_lines": ["", "[{\"container_id\": \"0d1bffb03d57\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"2.98%\", \"created\": \"2025-10-14T10:04:51.181976Z\", \"daemon_id\": \"np0005486732\", \"daemon_name\": \"mon.np0005486732\", \"daemon_type\": \"mon\", \"hostname\": \"np0005486732.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:05:14.505635Z\", \"memory_request\": 2147483648, \"memory_usage\": 50342133, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2025-10-14T10:04:51.069491Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}]"]} TASK [ceph_migrate : Get ceph_cli] ********************************************* included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_cli.yaml for np0005486728.localdomain TASK [ceph_migrate : Set ceph CLI] ********************************************* ok: [np0005486728.localdomain] => {"ansible_facts": {"ceph_cli": "podman run --rm --net=host --ipc=host --volume /home/tripleo-admin/ceph_client:/etc/ceph:z --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring "}, "changed": false} TASK [ceph_migrate : Migrate mon] ********************************************** ok: [np0005486728.localdomain] => { "msg": "Migrate mon: np0005486730.localdomain to node: np0005486733.localdomain" } TASK [ceph_migrate : MON - Get current mon IP address from node_map override] *** ok: [np0005486728.localdomain] => {"ansible_facts": {"mon_ipaddr": "172.18.0.105"}, "changed": false} TASK [ceph_migrate : MON - Check mons quorum] ********************************** changed: [np0005486728.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "-s", "-f", "json"], "delta": "0:00:00.691843", "end": "2025-10-14 10:05:29.589463", "msg": "", "rc": 0, "start": "2025-10-14 10:05:28.897620", "stderr": "", "stderr_lines": [], "stdout": "\n{\"fsid\":\"fcadf6e2-9176-5818-a8d0-37b19acf8eaf\",\"health\":{\"status\":\"HEALTH_WARN\",\"checks\":{\"CEPHADM_STRAY_DAEMON\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"2 stray daemon(s) not managed by cephadm\",\"count\":2},\"muted\":false},\"CEPHADM_STRAY_HOST\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"2 stray host(s) with 2 daemon(s) not managed by 
cephadm\",\"count\":2},\"muted\":false}},\"mutes\":[]},\"election_epoch\":52,\"quorum\":[0,1,2,3],\"quorum_names\":[\"np0005486730\",\"np0005486733\",\"np0005486731\",\"np0005486732\"],\"quorum_age\":18,\"monmap\":{\"epoch\":12,\"min_mon_release_name\":\"reef\",\"num_mons\":4},\"osdmap\":{\"epoch\":82,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1760428371,\"num_in_osds\":6,\"osd_in_since\":1760428350,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109574332,\"bytes_used\":615526400,\"bytes_avail\":44456464384,\"bytes_total\":45071990784,\"read_bytes_sec\":19779,\"write_bytes_sec\":0,\"read_op_per_sec\":9,\"write_op_per_sec\":1},\"fsmap\":{\"epoch\":16,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005486732.xkownj\",\"status\":\"up:active\",\"gid\":26888}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":5,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":80,\"modified\":\"2025-10-14T10:04:45.787700+0000\",\"services\":{}},\"progress_events\":{}}", "stdout_lines": ["", "{\"fsid\":\"fcadf6e2-9176-5818-a8d0-37b19acf8eaf\",\"health\":{\"status\":\"HEALTH_WARN\",\"checks\":{\"CEPHADM_STRAY_DAEMON\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"2 stray daemon(s) not managed by cephadm\",\"count\":2},\"muted\":false},\"CEPHADM_STRAY_HOST\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"2 stray host(s) with 2 daemon(s) not managed by cephadm\",\"count\":2},\"muted\":false}},\"mutes\":[]},\"election_epoch\":52,\"quorum\":[0,1,2,3],\"quorum_names\":[\"np0005486730\",\"np0005486733\",\"np0005486731\",\"np0005486732\"],\"quorum_age\":18,\"monmap\":{\"epoch\":12,\"min_mon_release_name\":\"reef\",\"num_mons\":4},\"osdmap\":{\"epoch\":82,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1760428371,\"num_in_osds\":6,\"osd_in_since\":1760428350,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109574332,\"bytes_used\":615526400,\"bytes_avail\":44456464384,\"bytes_total\":45071990784,\"read_bytes_sec\":19779,\"write_bytes_sec\":0,\"read_op_per_sec\":9,\"write_op_per_sec\":1},\"fsmap\":{\"epoch\":16,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005486732.xkownj\",\"status\":\"up:active\",\"gid\":26888}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":5,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":80,\"modified\":\"2025-10-14T10:04:45.787700+0000\",\"services\":{}},\"progress_events\":{}}"]} TASK [ceph_migrate : Backup data for client purposes] ************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/backup.yaml for np0005486728.localdomain TASK [ceph_migrate : Ensure backup directory exists] *************************** changed: [np0005486728.localdomain -> np0005486730.localdomain(192.168.122.105)] => {"changed": true, "gid": 1003, "group": "tripleo-admin", "mode": "0755", "owner": "tripleo-admin", "path": "/home/tripleo-admin/ceph_client", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 6, "state": "directory", "uid": 1003} TASK [ceph_migrate : Check file in the src directory] ************************** ok: 
[np0005486728.localdomain -> np0005486730.localdomain(192.168.122.105)] => {"changed": false, "examined": 2, "files": [{"atime": 1760436316.471514, "ctime": 1760436316.9035273, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 973079028, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1760436316.6725202, "nlink": 1, "path": "/etc/ceph/ceph.conf", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 319, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, {"atime": 1760436317.7765539, "ctime": 1760436318.1875663, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 973147915, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0600", "mtime": 1760436317.9565594, "nlink": 1, "path": "/etc/ceph/ceph.client.admin.keyring", "pw_name": "root", "rgrp": false, "roth": false, "rusr": true, "size": 151, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}], "matched": 2, "msg": "All paths examined", "skipped_paths": {}} TASK [ceph_migrate : Backup ceph client data] ********************************** changed: [np0005486728.localdomain -> np0005486730.localdomain(192.168.122.105)] => (item={'path': '/etc/ceph/ceph.conf', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 319, 'inode': 973079028, 'dev': 64516, 'nlink': 1, 'atime': 1760436316.471514, 'mtime': 1760436316.6725202, 'ctime': 1760436316.9035273, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) => {"ansible_loop_var": "item", "changed": true, "checksum": "1201be29d50d9a819b9ce108026578d9065d9118", "dest": "/home/tripleo-admin/ceph_client/ceph.conf", "gid": 0, "group": "root", "item": {"atime": 1760436316.471514, "ctime": 1760436316.9035273, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 973079028, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1760436316.6725202, "nlink": 1, "path": "/etc/ceph/ceph.conf", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 319, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, "md5sum": "c7a19496a7eb94d7d5e7d45f78fb6d92", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 319, "src": "/etc/ceph/ceph.conf", "state": "file", "uid": 0} changed: [np0005486728.localdomain -> np0005486730.localdomain(192.168.122.105)] => (item={'path': '/etc/ceph/ceph.client.admin.keyring', 'mode': '0600', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 151, 'inode': 973147915, 'dev': 64516, 'nlink': 1, 'atime': 1760436317.7765539, 'mtime': 1760436317.9565594, 'ctime': 1760436318.1875663, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': False, 'xgrp': False, 'woth': False, 'roth': False, 'xoth': False, 'isuid': False, 'isgid': False}) => {"ansible_loop_var": "item", 
"changed": true, "checksum": "431588685abe2dfd7f03c8784108d8962e66b6df", "dest": "/home/tripleo-admin/ceph_client/ceph.client.admin.keyring", "gid": 0, "group": "root", "item": {"atime": 1760436317.7765539, "ctime": 1760436318.1875663, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 973147915, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0600", "mtime": 1760436317.9565594, "nlink": 1, "path": "/etc/ceph/ceph.client.admin.keyring", "pw_name": "root", "rgrp": false, "roth": false, "rusr": true, "size": 151, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, "md5sum": "24ba031d88354429461e0fa6a2c9f8ca", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 151, "src": "/etc/ceph/ceph.client.admin.keyring", "state": "file", "uid": 0} TASK [ceph_migrate : MON - Get the current active mgr] ************************* changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "mgr", "stat", "-f", "json"], "delta": "0:00:00.707880", "end": "2025-10-14 10:05:34.289307", "msg": "", "rc": 0, "start": "2025-10-14 10:05:33.581427", "stderr": "", "stderr_lines": [], "stdout": "\n{\"epoch\":30,\"available\":true,\"active_name\":\"np0005486732.pasqzz\",\"num_standby\":5}", "stdout_lines": ["", "{\"epoch\":30,\"available\":true,\"active_name\":\"np0005486732.pasqzz\",\"num_standby\":5}"]} TASK [ceph_migrate : MON - Load mgr data] ************************************** ok: [np0005486728.localdomain] => {"ansible_facts": {"mgr": {"active_name": "np0005486732.pasqzz", "available": true, "epoch": 30, "num_standby": 5}}, "changed": false} TASK [ceph_migrate : Print active mgr] ***************************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Fail mgr if active in the current node] ******************* skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "mgr.active_name | regex_search(cur_mon | split('.') | first) or mgr.active_name | regex_search(target_node | split('.') | first)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - Drain and rm --force the cur_mon host] ************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/drain.yaml for np0005486728.localdomain TASK [ceph_migrate : Get ceph_cli] ********************************************* included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_cli.yaml for np0005486728.localdomain TASK [ceph_migrate : Set ceph CLI] ********************************************* ok: [np0005486728.localdomain] => {"ansible_facts": {"ceph_cli": "podman run --rm --net=host --ipc=host --volume /home/tripleo-admin/ceph_client:/etc/ceph:z --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring "}, "changed": false} TASK [ceph_migrate : MON - wait daemons] *************************************** changed: 
[np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "ps", "--daemon_type", "mon", "--daemon_id", "np0005486730", "-f", "json"], "delta": "0:00:00.724332", "end": "2025-10-14 10:05:35.863466", "msg": "", "rc": 0, "start": "2025-10-14 10:05:35.139134", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"container_id\": \"2f3a0ed4fd5c\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"1.07%\", \"created\": \"2025-10-14T07:51:58.881199Z\", \"daemon_id\": \"np0005486730\", \"daemon_name\": \"mon.np0005486730\", \"daemon_type\": \"mon\", \"hostname\": \"np0005486730.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:05:14.407142Z\", \"memory_request\": 2147483648, \"memory_usage\": 149107507, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2025-10-14T07:51:58.757906Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}]", "stdout_lines": ["", "[{\"container_id\": \"2f3a0ed4fd5c\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"1.07%\", \"created\": \"2025-10-14T07:51:58.881199Z\", \"daemon_id\": \"np0005486730\", \"daemon_name\": \"mon.np0005486730\", \"daemon_type\": \"mon\", \"hostname\": \"np0005486730.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:05:14.407142Z\", \"memory_request\": 2147483648, \"memory_usage\": 149107507, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2025-10-14T07:51:58.757906Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}]"]} TASK [ceph_migrate : DRAIN - Delete the mon running on the current controller node] *** changed: [np0005486728.localdomain -> np0005486730.localdomain(192.168.122.105)] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "daemon", "rm", "mon.np0005486730", "--force"], "delta": "0:00:02.358875", "end": "2025-10-14 10:05:39.034392", "msg": "", "rc": 0, "start": "2025-10-14 10:05:36.675517", "stderr": "", "stderr_lines": [], "stdout": "Removed mon.np0005486730 from host 'np0005486730.localdomain'", "stdout_lines": ["Removed mon.np0005486730 from host 'np0005486730.localdomain'"]} TASK [ceph_migrate : DRAIN 
- remove label from the src node] ******************* included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/labels.yaml for np0005486728.localdomain TASK [ceph_migrate : Set/Unset labels - rm] ************************************ skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Print nodes] ********************************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Set/Unset labels - rm] ************************************ changed: [np0005486728.localdomain] => (item=['np0005486730.localdomain', 'mon']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "rm", "np0005486730.localdomain", "mon"], "delta": "0:00:03.698877", "end": "2025-10-14 10:05:43.467010", "item": ["np0005486730.localdomain", "mon"], "msg": "", "rc": 0, "start": "2025-10-14 10:05:39.768133", "stderr": "", "stderr_lines": [], "stdout": "Removed label mon from host np0005486730.localdomain", "stdout_lines": ["Removed label mon from host np0005486730.localdomain"]} changed: [np0005486728.localdomain] => (item=['np0005486730.localdomain', 'mgr']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "rm", "np0005486730.localdomain", "mgr"], "delta": "0:00:00.836076", "end": "2025-10-14 10:05:44.792586", "item": ["np0005486730.localdomain", "mgr"], "msg": "", "rc": 0, "start": "2025-10-14 10:05:43.956510", "stderr": "", "stderr_lines": [], "stdout": "Removed label mgr from host np0005486730.localdomain", "stdout_lines": ["Removed label mgr from host np0005486730.localdomain"]} changed: [np0005486728.localdomain] => (item=['np0005486730.localdomain', '_admin']) => {"ansible_loop_var": "item", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "label", "rm", "np0005486730.localdomain", "_admin"], "delta": "0:00:05.755734", "end": "2025-10-14 10:05:51.055172", "item": ["np0005486730.localdomain", "_admin"], "msg": "", "rc": 0, "start": "2025-10-14 10:05:45.299438", "stderr": "", "stderr_lines": [], "stdout": "Removed label _admin from host np0005486730.localdomain", "stdout_lines": ["Removed label _admin from host np0005486730.localdomain"]} TASK [ceph_migrate : Wait for the orchestrator to remove labels] *************** Pausing for 10 seconds ok: [np0005486728.localdomain] => {"changed": false, "delta": 10, "echo": true, "rc": 0, "start": "2025-10-14 10:05:51.171474", "stderr": "", "stdout": "Paused for 10.01 seconds", "stop": "2025-10-14 10:06:01.183563", "user_input": ""} 
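For readers following the flow rather than the raw task output: the mon migration performed by the surrounding tasks (daemon removal, label cleanup, host drain and removal, IP move, mon redeploy) reduces to a short command sequence. The lines below are a condensed sketch reconstructed from the task output in this log, not an additional step that was executed; the host names, the monitor address 172.18.0.105 and the vlan21 device are the values from this particular run, and the ceph commands are invoked through the containerized ceph_cli wrapper (podman run ... --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest ...) shown in the "Set ceph CLI" tasks.

    # remove the mon daemon and the orchestrator labels from the source controller
    ceph orch daemon rm mon.np0005486730 --force
    ceph orch host label rm np0005486730.localdomain mon      # repeated for mgr and _admin
    # drain the remaining managed daemons and drop the host from the cluster
    ceph orch host drain np0005486730.localdomain
    ceph orch host rm np0005486730.localdomain --force
    # move the mon IP: /etc/os-net-config/tripleo_config.yaml is patched on both nodes,
    # then (with manual migration) the address is moved by hand
    ip a del 172.18.0.105/24 dev vlan21       # on np0005486730 (source)
    ip a add 172.18.0.105/24 dev vlan21       # on np0005486733 (target)
    # redeploy the mon on the target node bound to the migrated address
    ceph orch daemon rm mon.np0005486733 --force
    ceph orch daemon add mon np0005486733.localdomain:172.18.0.105
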
TASK [ceph_migrate : DRAIN - Drain the host] *********************************** changed: [np0005486728.localdomain -> np0005486730.localdomain(192.168.122.105)] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "drain", "np0005486730.localdomain"], "delta": "0:00:00.743897", "end": "2025-10-14 10:06:02.614126", "msg": "", "rc": 0, "start": "2025-10-14 10:06:01.870229", "stderr": "", "stderr_lines": [], "stdout": "Scheduled to remove the following daemons from host 'np0005486730.localdomain'\ntype id \n-------------------- ---------------\ncrash np0005486730 ", "stdout_lines": ["Scheduled to remove the following daemons from host 'np0005486730.localdomain'", "type id ", "-------------------- ---------------", "crash np0005486730 "]} TASK [ceph_migrate : DRAIN - cleanup the host] ********************************* skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "force_clean | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - check host in hostmap] ****************************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "ls", "--host_pattern", "np0005486730.localdomain", "-f", "json"], "delta": "0:00:00.700333", "end": "2025-10-14 10:06:03.968022", "msg": "", "rc": 0, "start": "2025-10-14 10:06:03.267689", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"addr\": \"192.168.122.105\", \"hostname\": \"np0005486730.localdomain\", \"labels\": [\"_no_schedule\", \"_no_conf_keyring\"], \"status\": \"\"}]", "stdout_lines": ["", "[{\"addr\": \"192.168.122.105\", \"hostname\": \"np0005486730.localdomain\", \"labels\": [\"_no_schedule\", \"_no_conf_keyring\"], \"status\": \"\"}]"]} TASK [ceph_migrate : MON - rm the cur_mon host from the Ceph cluster] ********** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "host", "rm", "np0005486730.localdomain", "--force"], "delta": "0:00:00.698059", "end": "2025-10-14 10:06:05.201317", "msg": "", "rc": 0, "start": "2025-10-14 10:06:04.503258", "stderr": "", "stderr_lines": [], "stdout": "Removed host 'np0005486730.localdomain'", "stdout_lines": ["Removed host 'np0005486730.localdomain'"]} TASK [ceph_migrate : MON - Get current mon IP address] ************************* skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "mon_ipaddr is not defined", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - Migrate the mon ip address from src to target node] *** included: 
/home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/network.yaml for np0005486728.localdomain TASK [ceph_migrate : MON - Print current mon IP address] *********************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : MON - Patch net-config config and remove the IP address (src node)] *** changed: [np0005486728.localdomain -> np0005486730.localdomain(192.168.122.105)] => {"backup": "/etc/os-net-config/tripleo_config.yaml.485610.2025-10-14@10:06:06~", "changed": true, "found": 1, "msg": "1 line(s) removed"} TASK [ceph_migrate : MON - Refresh os-net-config (src node)] ******************* skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "not manual_migration | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - manually rm the ip address (src node)] ************** changed: [np0005486728.localdomain -> np0005486730.localdomain(192.168.122.105)] => {"changed": true, "cmd": ["ip", "a", "del", "172.18.0.105/24", "dev", "vlan21"], "delta": "0:00:00.004664", "end": "2025-10-14 10:06:06.856753", "msg": "", "rc": 0, "start": "2025-10-14 10:06:06.852089", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : MON - Patch net-config config and add the IP address (target node)] *** changed: [np0005486728.localdomain -> np0005486733.localdomain(192.168.122.108)] => {"backup": "/etc/os-net-config/tripleo_config.yaml.314620.2025-10-14@10:06:08~", "changed": true, "msg": "line added"} TASK [ceph_migrate : MON - Refresh os-net-config (target_node)] **************** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "not manual_migration | bool | default(false)", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - statically assign the ip address to the target node] *** changed: [np0005486728.localdomain -> np0005486733.localdomain(192.168.122.108)] => {"changed": true, "cmd": ["ip", "a", "add", "172.18.0.105/24", "dev", "vlan21"], "delta": "0:00:00.006111", "end": "2025-10-14 10:06:08.968551", "msg": "", "rc": 0, "start": "2025-10-14 10:06:08.962440", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : MON - ping ip address to see if is reachable on the target node] *** changed: [np0005486728.localdomain -> np0005486733.localdomain(192.168.122.108)] => {"changed": true, "cmd": ["ping", "-W1", "-c", "3", "172.18.0.105"], "delta": "0:00:02.045426", "end": "2025-10-14 10:06:11.687035", "msg": "", "rc": 0, "start": "2025-10-14 10:06:09.641609", "stderr": "", "stderr_lines": [], "stdout": "PING 172.18.0.105 (172.18.0.105) 56(84) bytes of data.\n64 bytes from 172.18.0.105: icmp_seq=1 ttl=64 time=0.049 ms\n64 bytes from 172.18.0.105: icmp_seq=2 ttl=64 time=0.053 ms\n64 bytes from 172.18.0.105: icmp_seq=3 ttl=64 time=0.761 ms\n\n--- 172.18.0.105 ping statistics ---\n3 packets transmitted, 3 received, 0% packet loss, time 2037ms\nrtt min/avg/max/mdev = 0.049/0.287/0.761/0.334 ms", "stdout_lines": ["PING 172.18.0.105 (172.18.0.105) 56(84) bytes of data.", "64 bytes from 172.18.0.105: icmp_seq=1 ttl=64 time=0.049 ms", "64 bytes from 172.18.0.105: icmp_seq=2 ttl=64 time=0.053 ms", "64 bytes from 172.18.0.105: icmp_seq=3 ttl=64 time=0.761 ms", "", "--- 172.18.0.105 ping statistics ---", "3 packets transmitted, 3 received, 0% packet loss, time 2037ms", "rtt min/avg/max/mdev = 0.049/0.287/0.761/0.334 ms"]} TASK [ceph_migrate : 
MON - Fail if the IP address is not active in the target node] *** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ping_target_ip.rc != 0", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Unmanage mons] ******************************************** ok: [np0005486728.localdomain] => {"changed": false, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "-v", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "-v", "/var/lib/ceph/:/var/lib/ceph/:z", "-v", "/var/log/ceph/:/var/log/ceph/:z", "-v", "/home/tripleo-admin/mon:/home/tripleo-admin/mon:z", "--entrypoint=ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "-n", "client.admin", "-k", "/etc/ceph/ceph.client.admin.keyring", "--cluster", "ceph", "orch", "apply", "--in-file", "/home/tripleo-admin/mon"], "delta": "0:00:00.706428", "end": "2025-10-14 10:06:13.050776", "rc": 0, "start": "2025-10-14 10:06:12.344348", "stderr": "", "stderr_lines": [], "stdout": "Scheduled mon update...", "stdout_lines": ["Scheduled mon update..."]} TASK [ceph_migrate : Print the resulting spec] ********************************* skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : MON - Get tmp mon] **************************************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "ps", "--daemon_type", "mon", "--daemon_id", "np0005486733", "-f", "json"], "delta": "0:00:00.690164", "end": "2025-10-14 10:06:14.383160", "msg": "", "rc": 0, "start": "2025-10-14 10:06:13.692996", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"container_id\": \"294d8462825a\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"3.88%\", \"created\": \"2025-10-14T10:01:53.292940Z\", \"daemon_id\": \"np0005486733\", \"daemon_name\": \"mon.np0005486733\", \"daemon_type\": \"mon\", \"events\": [\"2025-10-14T10:05:37.251037Z daemon:mon.np0005486733 [INFO] \\\"Reconfigured mon.np0005486733 on host 'np0005486733.localdomain'\\\"\"], \"hostname\": \"np0005486733.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:05:14.573389Z\", \"memory_request\": 2147483648, \"memory_usage\": 94927585, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2025-10-14T10:01:53.168491Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}]", "stdout_lines": ["", "[{\"container_id\": \"294d8462825a\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": 
\"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"3.88%\", \"created\": \"2025-10-14T10:01:53.292940Z\", \"daemon_id\": \"np0005486733\", \"daemon_name\": \"mon.np0005486733\", \"daemon_type\": \"mon\", \"events\": [\"2025-10-14T10:05:37.251037Z daemon:mon.np0005486733 [INFO] \\\"Reconfigured mon.np0005486733 on host 'np0005486733.localdomain'\\\"\"], \"hostname\": \"np0005486733.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:05:14.573389Z\", \"memory_request\": 2147483648, \"memory_usage\": 94927585, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2025-10-14T10:01:53.168491Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}]"]} TASK [ceph_migrate : MON - Delete the running mon] ***************************** changed: [np0005486728.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "daemon", "rm", "mon.np0005486733", "--force"], "delta": "0:00:02.166320", "end": "2025-10-14 10:06:17.171197", "msg": "", "rc": 0, "start": "2025-10-14 10:06:15.004877", "stderr": "", "stderr_lines": [], "stdout": "Removed mon.np0005486733 from host 'np0005486733.localdomain'", "stdout_lines": ["Removed mon.np0005486733 from host 'np0005486733.localdomain'"]} TASK [ceph_migrate : MON - Wait for the current mon to be deleted] ************* Pausing for 10 seconds ok: [np0005486728.localdomain] => {"changed": false, "delta": 10, "echo": true, "rc": 0, "start": "2025-10-14 10:06:17.303410", "stderr": "", "stdout": "Paused for 10.01 seconds", "stop": "2025-10-14 10:06:27.317322", "user_input": ""} TASK [ceph_migrate : MON - Redeploy mon on np0005486733.localdomain] *********** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : MON - Redeploy mon on np0005486733.localdomain] *********** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "daemon", "add", "mon", "np0005486733.localdomain:172.18.0.105"], "delta": "0:00:06.515485", "end": "2025-10-14 10:06:34.372580", "msg": "", "rc": 0, "start": "2025-10-14 10:06:27.857095", "stderr": "", "stderr_lines": [], "stdout": "Deployed mon.np0005486733 on host 'np0005486733.localdomain'", "stdout_lines": ["Deployed mon.np0005486733 on host 'np0005486733.localdomain'"]} TASK [ceph_migrate : MON - Wait for the spec to be updated] ******************** Pausing for 10 seconds ok: [np0005486728.localdomain] => {"changed": false, "delta": 10, "echo": true, "rc": 0, "start": "2025-10-14 10:06:34.500893", "stderr": "", "stdout": "Paused for 10.01 seconds", "stop": "2025-10-14 10:06:44.513900", "user_input": ""} TASK [ceph_migrate : MON - Check mons quorum] ********************************** changed: [np0005486728.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", 
"ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "-s", "-f", "json"], "delta": "0:00:00.786201", "end": "2025-10-14 10:06:45.840124", "msg": "", "rc": 0, "start": "2025-10-14 10:06:45.053923", "stderr": "", "stderr_lines": [], "stdout": "\n{\"fsid\":\"fcadf6e2-9176-5818-a8d0-37b19acf8eaf\",\"health\":{\"status\":\"HEALTH_WARN\",\"checks\":{\"CEPHADM_STRAY_DAEMON\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"2 stray daemon(s) not managed by cephadm\",\"count\":2},\"muted\":false},\"CEPHADM_STRAY_HOST\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"2 stray host(s) with 2 daemon(s) not managed by cephadm\",\"count\":2},\"muted\":false}},\"mutes\":[]},\"election_epoch\":70,\"quorum\":[0,1,2],\"quorum_names\":[\"np0005486731\",\"np0005486732\",\"np0005486733\"],\"quorum_age\":2,\"monmap\":{\"epoch\":17,\"min_mon_release_name\":\"reef\",\"num_mons\":3},\"osdmap\":{\"epoch\":82,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1760428371,\"num_in_osds\":6,\"osd_in_since\":1760428350,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109574614,\"bytes_used\":615534592,\"bytes_avail\":44456456192,\"bytes_total\":45071990784,\"write_bytes_sec\":304,\"read_op_per_sec\":0,\"write_op_per_sec\":0},\"fsmap\":{\"epoch\":16,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005486732.xkownj\",\"status\":\"up:active\",\"gid\":26888}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":5,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":84,\"modified\":\"2025-10-14T10:06:19.010116+0000\",\"services\":{}},\"progress_events\":{}}", "stdout_lines": ["", "{\"fsid\":\"fcadf6e2-9176-5818-a8d0-37b19acf8eaf\",\"health\":{\"status\":\"HEALTH_WARN\",\"checks\":{\"CEPHADM_STRAY_DAEMON\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"2 stray daemon(s) not managed by cephadm\",\"count\":2},\"muted\":false},\"CEPHADM_STRAY_HOST\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"2 stray host(s) with 2 daemon(s) not managed by cephadm\",\"count\":2},\"muted\":false}},\"mutes\":[]},\"election_epoch\":70,\"quorum\":[0,1,2],\"quorum_names\":[\"np0005486731\",\"np0005486732\",\"np0005486733\"],\"quorum_age\":2,\"monmap\":{\"epoch\":17,\"min_mon_release_name\":\"reef\",\"num_mons\":3},\"osdmap\":{\"epoch\":82,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1760428371,\"num_in_osds\":6,\"osd_in_since\":1760428350,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109574614,\"bytes_used\":615534592,\"bytes_avail\":44456456192,\"bytes_total\":45071990784,\"write_bytes_sec\":304,\"read_op_per_sec\":0,\"write_op_per_sec\":0},\"fsmap\":{\"epoch\":16,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005486732.xkownj\",\"status\":\"up:active\",\"gid\":26888}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":5,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":84,\"modified\":\"2025-10-14T10:06:19.010116+0000\",\"services\":{}},\"progress_events\":{}}"]} TASK [ceph_migrate : reconfig osds] 
******************************************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "reconfig", "osd.default_drive_group"], "delta": "0:00:00.998112", "end": "2025-10-14 10:06:47.518422", "msg": "", "rc": 0, "start": "2025-10-14 10:06:46.520310", "stderr": "", "stderr_lines": [], "stdout": "Scheduled to reconfig osd.2 on host 'np0005486731.localdomain'\nScheduled to reconfig osd.4 on host 'np0005486731.localdomain'\nScheduled to reconfig osd.1 on host 'np0005486732.localdomain'\nScheduled to reconfig osd.5 on host 'np0005486732.localdomain'\nScheduled to reconfig osd.0 on host 'np0005486733.localdomain'\nScheduled to reconfig osd.3 on host 'np0005486733.localdomain'", "stdout_lines": ["Scheduled to reconfig osd.2 on host 'np0005486731.localdomain'", "Scheduled to reconfig osd.4 on host 'np0005486731.localdomain'", "Scheduled to reconfig osd.1 on host 'np0005486732.localdomain'", "Scheduled to reconfig osd.5 on host 'np0005486732.localdomain'", "Scheduled to reconfig osd.0 on host 'np0005486733.localdomain'", "Scheduled to reconfig osd.3 on host 'np0005486733.localdomain'"]} TASK [ceph_migrate : force-fail ceph mgr] ************************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/fail_mgr.yaml for np0005486728.localdomain TASK [ceph_migrate : Refresh ceph_cli] ***************************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_cli.yaml for np0005486728.localdomain TASK [ceph_migrate : Set ceph CLI] ********************************************* ok: [np0005486728.localdomain] => {"ansible_facts": {"ceph_cli": "podman run --rm --net=host --ipc=host --volume /home/tripleo-admin/ceph_client:/etc/ceph:z --entrypoint ceph registry.redhat.io/rhceph/rhceph-7-rhel9:latest --fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring "}, "changed": false} TASK [ceph_migrate : Force fail ceph mgr] ************************************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "mgr", "fail"], "delta": "0:00:00.741716", "end": "2025-10-14 10:06:48.981341", "msg": "", "rc": 0, "start": "2025-10-14 10:06:48.239625", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [ceph_migrate : Wait for cephadm to reconcile] **************************** Pausing for 10 seconds ok: [np0005486728.localdomain] => {"changed": false, "delta": 10, "echo": true, "rc": 0, "start": "2025-10-14 10:06:49.111097", "stderr": "", "stdout": "Paused for 10.01 seconds", "stop": "2025-10-14 10:06:59.123563", "user_input": ""} TASK [ceph_migrate : Get the ceph orchestrator status with] ******************** ASYNC OK on np0005486728.localdomain: jid=j152494400679.504851 changed: [np0005486728.localdomain] => {"ansible_job_id": 
"j152494400679.504851", "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "status", "--format", "json"], "delta": "0:00:00.802357", "end": "2025-10-14 10:07:00.597622", "failed_when_result": false, "finished": 1, "msg": "", "rc": 0, "results_file": "/root/.ansible_async/j152494400679.504851", "start": "2025-10-14 10:06:59.795265", "started": 1, "stderr": "", "stderr_lines": [], "stdout": "\n{\"available\": true, \"backend\": \"cephadm\", \"paused\": false, \"workers\": 10}", "stdout_lines": ["", "{\"available\": true, \"backend\": \"cephadm\", \"paused\": false, \"workers\": 10}"]} TASK [ceph_migrate : Restart the active mgr] *********************************** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "\"Timeout exceeded\" in ceph_orch_status.msg", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : Fail if ceph orchestrator is still not responding] ******** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "\"Timeout exceeded\" in ceph_orch_status.msg", "skip_reason": "Conditional result was False"} TASK [ceph_migrate : MON - Manage mons] **************************************** ok: [np0005486728.localdomain] => {"changed": false, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "-v", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "-v", "/var/lib/ceph/:/var/lib/ceph/:z", "-v", "/var/log/ceph/:/var/log/ceph/:z", "-v", "/home/tripleo-admin/mon:/home/tripleo-admin/mon:z", "--entrypoint=ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "-n", "client.admin", "-k", "/etc/ceph/ceph.client.admin.keyring", "--cluster", "ceph", "orch", "apply", "--in-file", "/home/tripleo-admin/mon"], "delta": "0:00:00.745910", "end": "2025-10-14 10:07:02.993850", "rc": 0, "start": "2025-10-14 10:07:02.247940", "stderr": "", "stderr_lines": [], "stdout": "Scheduled mon update...", "stdout_lines": ["Scheduled mon update..."]} TASK [ceph_migrate : MON - wait daemons] *************************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/wait_daemons.yaml for np0005486728.localdomain TASK [ceph_migrate : print daemon id option] *********************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : wait for mon] ********************************************* changed: [np0005486728.localdomain] => {"attempts": 1, "changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "orch", "ps", "--daemon_type", "mon", "--daemon_id", "np0005486733", "-f", "json"], "delta": "0:00:00.771971", "end": "2025-10-14 10:07:04.486240", "msg": "", "rc": 0, "start": "2025-10-14 10:07:03.714269", "stderr": "", "stderr_lines": [], "stdout": "\n[{\"container_id\": \"3b13a3d859b6\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\", 
\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"3.90%\", \"created\": \"2025-10-14T10:06:34.166423Z\", \"daemon_id\": \"np0005486733\", \"daemon_name\": \"mon.np0005486733\", \"daemon_type\": \"mon\", \"hostname\": \"np0005486733.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:06:51.740410Z\", \"memory_request\": 2147483648, \"memory_usage\": 60534292, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2025-10-14T10:06:34.049151Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}]", "stdout_lines": ["", "[{\"container_id\": \"3b13a3d859b6\", \"container_image_digests\": [\"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\", \"registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:acbbd001ec0d511cce0af82966a041eabbb032c11399bf602a6ae2a93f8d951f\"], \"container_image_id\": \"2cc8f762c607dd6b63605ebb0d85c92055792cd8c0f98779ebfcae95ceb68e28\", \"container_image_name\": \"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\", \"cpu_percentage\": \"3.90%\", \"created\": \"2025-10-14T10:06:34.166423Z\", \"daemon_id\": \"np0005486733\", \"daemon_name\": \"mon.np0005486733\", \"daemon_type\": \"mon\", \"hostname\": \"np0005486733.localdomain\", \"is_active\": false, \"last_refresh\": \"2025-10-14T10:06:51.740410Z\", \"memory_request\": 2147483648, \"memory_usage\": 60534292, \"ports\": [], \"service_name\": \"mon\", \"started\": \"2025-10-14T10:06:34.049151Z\", \"status\": 1, \"status_desc\": \"running\", \"version\": \"18.2.1-361.el9cp\"}]"]} TASK [ceph_migrate : Sleep before moving to the next mon] ********************** Pausing for 30 seconds ok: [np0005486728.localdomain] => {"changed": false, "delta": 30, "echo": true, "rc": 0, "start": "2025-10-14 10:07:04.642532", "stderr": "", "stdout": "Paused for 30.03 seconds", "stop": "2025-10-14 10:07:34.675262", "user_input": ""} TASK [ceph_migrate : POST - Dump logs] ***************************************** included: /home/zuul/src/github.com/openstack-k8s-operators/data-plane-adoption/tests/roles/ceph_migrate/tasks/ceph_load.yaml for np0005486728.localdomain TASK [ceph_migrate : Check file in the src directory] ************************** ok: [np0005486728.localdomain] => {"changed": false, "examined": 4, "files": [{"atime": 1760436135.5184345, "ctime": 1760436135.3634298, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 1258359282, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1760436083.7158546, "nlink": 1, "path": "/home/tripleo-admin/ceph_client/ceph.conf", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 142, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, {"atime": 1760436135.5224347, "ctime": 1760436135.3624296, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 1166168095, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1760428235.1245375, "nlink": 1, "path": "/home/tripleo-admin/ceph_client/ceph.client.admin.keyring", "pw_name": "root", "rgrp": true, 
"roth": true, "rusr": true, "size": 151, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, {"atime": 1760429520.348444, "ctime": 1760436135.3634298, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 1166168096, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1760429518.0583742, "nlink": 1, "path": "/home/tripleo-admin/ceph_client/ceph.client.openstack.keyring", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 231, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, {"atime": 1760429521.3984761, "ctime": 1760436135.3624296, "dev": 64516, "gid": 0, "gr_name": "root", "inode": 1166168097, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1760429518.9504013, "nlink": 1, "path": "/home/tripleo-admin/ceph_client/ceph.client.manila.keyring", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 153, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}], "matched": 4, "msg": "All paths examined", "skipped_paths": {}} TASK [ceph_migrate : Restore files] ******************************************** changed: [np0005486728.localdomain] => (item=ceph.conf) => {"ansible_loop_var": "item", "changed": true, "checksum": "4b9917b88c50c9226c36092026baa78b27db1ccb", "dest": "/etc/ceph/ceph.conf", "gid": 0, "group": "root", "item": "ceph.conf", "md5sum": "9d8df54682588775cc0c4ccd7bffe88e", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 142, "src": "/home/tripleo-admin/ceph_client/ceph.conf", "state": "file", "uid": 0} changed: [np0005486728.localdomain] => (item=ceph.client.admin.keyring) => {"ansible_loop_var": "item", "changed": true, "checksum": "431588685abe2dfd7f03c8784108d8962e66b6df", "dest": "/etc/ceph/ceph.client.admin.keyring", "gid": 0, "group": "root", "item": "ceph.client.admin.keyring", "md5sum": "24ba031d88354429461e0fa6a2c9f8ca", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 151, "src": "/home/tripleo-admin/ceph_client/ceph.client.admin.keyring", "state": "file", "uid": 0} TASK [ceph_migrate : Ensure backup directory exists] *************************** changed: [np0005486728.localdomain] => {"changed": true, "gid": 1003, "group": "tripleo-admin", "mode": "0755", "owner": "tripleo-admin", "path": "/home/tripleo-admin/ceph_client/logs", "secontext": "unconfined_u:object_r:container_file_t:s0", "size": 6, "state": "directory", "uid": 1003} TASK [ceph_migrate : Get Ceph Health] ****************************************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["cephadm", "shell", "--", "ceph", "-s", "-f", "json"], "delta": "0:00:04.496592", "end": "2025-10-14 10:07:41.695799", "msg": "", "rc": 0, "start": "2025-10-14 10:07:37.199207", "stderr": "Inferring fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf\nInferring config /etc/ceph/ceph.conf\nUsing ceph image with id '2cc8f762c607' and tag 'latest' created on 2025-09-24 09:04:33 +0000 UTC\nregistry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8", "stderr_lines": ["Inferring fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "Inferring config /etc/ceph/ceph.conf", "Using ceph image with 
id '2cc8f762c607' and tag 'latest' created on 2025-09-24 09:04:33 +0000 UTC", "registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8"], "stdout": "\n{\"fsid\":\"fcadf6e2-9176-5818-a8d0-37b19acf8eaf\",\"health\":{\"status\":\"HEALTH_WARN\",\"checks\":{\"CEPHADM_STRAY_DAEMON\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"3 stray daemon(s) not managed by cephadm\",\"count\":3},\"muted\":false},\"CEPHADM_STRAY_HOST\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"3 stray host(s) with 3 daemon(s) not managed by cephadm\",\"count\":3},\"muted\":false}},\"mutes\":[]},\"election_epoch\":70,\"quorum\":[0,1,2],\"quorum_names\":[\"np0005486731\",\"np0005486732\",\"np0005486733\"],\"quorum_age\":57,\"monmap\":{\"epoch\":17,\"min_mon_release_name\":\"reef\",\"num_mons\":3},\"osdmap\":{\"epoch\":83,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1760428371,\"num_in_osds\":6,\"osd_in_since\":1760428350,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109575456,\"bytes_used\":615612416,\"bytes_avail\":44456378368,\"bytes_total\":45071990784},\"fsmap\":{\"epoch\":16,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005486732.xkownj\",\"status\":\"up:active\",\"gid\":26888}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":2,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":84,\"modified\":\"2025-10-14T10:06:19.010116+0000\",\"services\":{}},\"progress_events\":{}}", "stdout_lines": ["", "{\"fsid\":\"fcadf6e2-9176-5818-a8d0-37b19acf8eaf\",\"health\":{\"status\":\"HEALTH_WARN\",\"checks\":{\"CEPHADM_STRAY_DAEMON\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"3 stray daemon(s) not managed by cephadm\",\"count\":3},\"muted\":false},\"CEPHADM_STRAY_HOST\":{\"severity\":\"HEALTH_WARN\",\"summary\":{\"message\":\"3 stray host(s) with 3 daemon(s) not managed by cephadm\",\"count\":3},\"muted\":false}},\"mutes\":[]},\"election_epoch\":70,\"quorum\":[0,1,2],\"quorum_names\":[\"np0005486731\",\"np0005486732\",\"np0005486733\"],\"quorum_age\":57,\"monmap\":{\"epoch\":17,\"min_mon_release_name\":\"reef\",\"num_mons\":3},\"osdmap\":{\"epoch\":83,\"num_osds\":6,\"num_up_osds\":6,\"osd_up_since\":1760428371,\"num_in_osds\":6,\"osd_in_since\":1760428350,\"num_remapped_pgs\":0},\"pgmap\":{\"pgs_by_state\":[{\"state_name\":\"active+clean\",\"count\":177}],\"num_pgs\":177,\"num_pools\":7,\"num_objects\":69,\"data_bytes\":109575456,\"bytes_used\":615612416,\"bytes_avail\":44456378368,\"bytes_total\":45071990784},\"fsmap\":{\"epoch\":16,\"id\":1,\"up\":1,\"in\":1,\"max\":1,\"by_rank\":[{\"filesystem_id\":1,\"rank\":0,\"name\":\"mds.np0005486732.xkownj\",\"status\":\"up:active\",\"gid\":26888}],\"up:standby\":2},\"mgrmap\":{\"available\":true,\"num_standbys\":2,\"modules\":[\"cephadm\",\"iostat\",\"nfs\",\"restful\"],\"services\":{}},\"servicemap\":{\"epoch\":84,\"modified\":\"2025-10-14T10:06:19.010116+0000\",\"services\":{}},\"progress_events\":{}}"]} TASK [ceph_migrate : Load ceph data] ******************************************* ok: [np0005486728.localdomain] => {"ansible_facts": {"ceph": {"election_epoch": 70, "fsid": "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "fsmap": {"by_rank": [{"filesystem_id": 1, "gid": 26888, "name": "mds.np0005486732.xkownj", "rank": 0, "status": "up:active"}], "epoch": 16, 
"id": 1, "in": 1, "max": 1, "up": 1, "up:standby": 2}, "health": {"checks": {"CEPHADM_STRAY_DAEMON": {"muted": false, "severity": "HEALTH_WARN", "summary": {"count": 3, "message": "3 stray daemon(s) not managed by cephadm"}}, "CEPHADM_STRAY_HOST": {"muted": false, "severity": "HEALTH_WARN", "summary": {"count": 3, "message": "3 stray host(s) with 3 daemon(s) not managed by cephadm"}}}, "mutes": [], "status": "HEALTH_WARN"}, "mgrmap": {"available": true, "modules": ["cephadm", "iostat", "nfs", "restful"], "num_standbys": 2, "services": {}}, "monmap": {"epoch": 17, "min_mon_release_name": "reef", "num_mons": 3}, "osdmap": {"epoch": 83, "num_in_osds": 6, "num_osds": 6, "num_remapped_pgs": 0, "num_up_osds": 6, "osd_in_since": 1760428350, "osd_up_since": 1760428371}, "pgmap": {"bytes_avail": 44456378368, "bytes_total": 45071990784, "bytes_used": 615612416, "data_bytes": 109575456, "num_objects": 69, "num_pgs": 177, "num_pools": 7, "pgs_by_state": [{"count": 177, "state_name": "active+clean"}]}, "progress_events": {}, "quorum": [0, 1, 2], "quorum_age": 57, "quorum_names": ["np0005486731", "np0005486732", "np0005486733"], "servicemap": {"epoch": 84, "modified": "2025-10-14T10:06:19.010116+0000", "services": {}}}}, "changed": false} TASK [ceph_migrate : Dump ceph -s output to log file] ************************** changed: [np0005486728.localdomain] => {"changed": true, "checksum": "f79fdc5c045a5652c79f1f1413c3ecd16868391c", "dest": "/home/tripleo-admin/ceph_client/logs/ceph_health.log", "gid": 1003, "group": "tripleo-admin", "md5sum": "4db3bd85d2a50c0ad336655e03f632a9", "mode": "0644", "owner": "tripleo-admin", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 1436, "src": "/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760436461.9295788-59078-9665884230271/source", "state": "file", "uid": 1003} TASK [ceph_migrate : Get Ceph Orch ServiceMap] ********************************* changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["cephadm", "shell", "--", "ceph", "orch", "ls", "-f", "json"], "delta": "0:00:04.741607", "end": "2025-10-14 10:07:47.999831", "msg": "", "rc": 0, "start": "2025-10-14 10:07:43.258224", "stderr": "Inferring fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf\nInferring config /etc/ceph/ceph.conf\nUsing ceph image with id '2cc8f762c607' and tag 'latest' created on 2025-09-24 09:04:33 +0000 UTC\nregistry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8", "stderr_lines": ["Inferring fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "Inferring config /etc/ceph/ceph.conf", "Using ceph image with id '2cc8f762c607' and tag 'latest' created on 2025-09-24 09:04:33 +0000 UTC", "registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8"], "stdout": "\n[{\"placement\": {\"host_pattern\": \"*\"}, \"service_name\": \"crash\", \"service_type\": \"crash\", \"status\": {\"created\": \"2025-10-14T07:50:27.394836Z\", \"last_refresh\": \"2025-10-14T10:06:51.722171Z\", \"running\": 3, \"size\": 3}}, {\"placement\": {\"label\": \"mds\"}, \"service_id\": \"mds\", \"service_name\": \"mds.mds\", \"service_type\": \"mds\", \"status\": {\"created\": \"2025-10-14T10:00:01.922314Z\", \"last_refresh\": \"2025-10-14T10:06:51.722474Z\", \"running\": 3, \"size\": 3}}, {\"placement\": {\"label\": \"mgr\"}, \"service_name\": \"mgr\", \"service_type\": \"mgr\", \"status\": {\"created\": \"2025-10-14T10:01:29.541619Z\", \"last_refresh\": \"2025-10-14T10:06:51.722551Z\", \"running\": 3, \"size\": 
3}}, {\"events\": [\"2025-10-14T10:07:10.346293Z service:mon [INFO] \\\"service was created\\\"\"], \"placement\": {\"label\": \"mon\"}, \"service_name\": \"mon\", \"service_type\": \"mon\", \"status\": {\"created\": \"2025-10-14T10:07:02.843470Z\", \"last_refresh\": \"2025-10-14T10:06:51.722626Z\", \"running\": 3, \"size\": 3}}, {\"placement\": {\"host_pattern\": \"*\"}, \"service_name\": \"node-proxy\", \"service_type\": \"node-proxy\", \"status\": {\"created\": \"2025-10-14T07:50:41.216505Z\", \"running\": 0, \"size\": 0}}, {\"placement\": {\"hosts\": [\"np0005486731.localdomain\", \"np0005486732.localdomain\", \"np0005486733.localdomain\"]}, \"service_id\": \"default_drive_group\", \"service_name\": \"osd.default_drive_group\", \"service_type\": \"osd\", \"spec\": {\"data_devices\": {\"paths\": [\"/dev/ceph_vg0/ceph_lv0\", \"/dev/ceph_vg1/ceph_lv1\"]}, \"filter_logic\": \"AND\", \"objectstore\": \"bluestore\"}, \"status\": {\"created\": \"2025-10-14T07:51:08.318859Z\", \"last_refresh\": \"2025-10-14T10:06:51.722307Z\", \"running\": 6, \"size\": 6}}]", "stdout_lines": ["", "[{\"placement\": {\"host_pattern\": \"*\"}, \"service_name\": \"crash\", \"service_type\": \"crash\", \"status\": {\"created\": \"2025-10-14T07:50:27.394836Z\", \"last_refresh\": \"2025-10-14T10:06:51.722171Z\", \"running\": 3, \"size\": 3}}, {\"placement\": {\"label\": \"mds\"}, \"service_id\": \"mds\", \"service_name\": \"mds.mds\", \"service_type\": \"mds\", \"status\": {\"created\": \"2025-10-14T10:00:01.922314Z\", \"last_refresh\": \"2025-10-14T10:06:51.722474Z\", \"running\": 3, \"size\": 3}}, {\"placement\": {\"label\": \"mgr\"}, \"service_name\": \"mgr\", \"service_type\": \"mgr\", \"status\": {\"created\": \"2025-10-14T10:01:29.541619Z\", \"last_refresh\": \"2025-10-14T10:06:51.722551Z\", \"running\": 3, \"size\": 3}}, {\"events\": [\"2025-10-14T10:07:10.346293Z service:mon [INFO] \\\"service was created\\\"\"], \"placement\": {\"label\": \"mon\"}, \"service_name\": \"mon\", \"service_type\": \"mon\", \"status\": {\"created\": \"2025-10-14T10:07:02.843470Z\", \"last_refresh\": \"2025-10-14T10:06:51.722626Z\", \"running\": 3, \"size\": 3}}, {\"placement\": {\"host_pattern\": \"*\"}, \"service_name\": \"node-proxy\", \"service_type\": \"node-proxy\", \"status\": {\"created\": \"2025-10-14T07:50:41.216505Z\", \"running\": 0, \"size\": 0}}, {\"placement\": {\"hosts\": [\"np0005486731.localdomain\", \"np0005486732.localdomain\", \"np0005486733.localdomain\"]}, \"service_id\": \"default_drive_group\", \"service_name\": \"osd.default_drive_group\", \"service_type\": \"osd\", \"spec\": {\"data_devices\": {\"paths\": [\"/dev/ceph_vg0/ceph_lv0\", \"/dev/ceph_vg1/ceph_lv1\"]}, \"filter_logic\": \"AND\", \"objectstore\": \"bluestore\"}, \"status\": {\"created\": \"2025-10-14T07:51:08.318859Z\", \"last_refresh\": \"2025-10-14T10:06:51.722307Z\", \"running\": 6, \"size\": 6}}]"]} TASK [ceph_migrate : Load Service Map] ***************************************** ok: [np0005486728.localdomain] => {"ansible_facts": {"servicemap": [{"placement": {"host_pattern": "*"}, "service_name": "crash", "service_type": "crash", "status": {"created": "2025-10-14T07:50:27.394836Z", "last_refresh": "2025-10-14T10:06:51.722171Z", "running": 3, "size": 3}}, {"placement": {"label": "mds"}, "service_id": "mds", "service_name": "mds.mds", "service_type": "mds", "status": {"created": "2025-10-14T10:00:01.922314Z", "last_refresh": "2025-10-14T10:06:51.722474Z", "running": 3, "size": 3}}, {"placement": {"label": "mgr"}, "service_name": "mgr", 
"service_type": "mgr", "status": {"created": "2025-10-14T10:01:29.541619Z", "last_refresh": "2025-10-14T10:06:51.722551Z", "running": 3, "size": 3}}, {"events": ["2025-10-14T10:07:10.346293Z service:mon [INFO] \"service was created\""], "placement": {"label": "mon"}, "service_name": "mon", "service_type": "mon", "status": {"created": "2025-10-14T10:07:02.843470Z", "last_refresh": "2025-10-14T10:06:51.722626Z", "running": 3, "size": 3}}, {"placement": {"host_pattern": "*"}, "service_name": "node-proxy", "service_type": "node-proxy", "status": {"created": "2025-10-14T07:50:41.216505Z", "running": 0, "size": 0}}, {"placement": {"hosts": ["np0005486731.localdomain", "np0005486732.localdomain", "np0005486733.localdomain"]}, "service_id": "default_drive_group", "service_name": "osd.default_drive_group", "service_type": "osd", "spec": {"data_devices": {"paths": ["/dev/ceph_vg0/ceph_lv0", "/dev/ceph_vg1/ceph_lv1"]}, "filter_logic": "AND", "objectstore": "bluestore"}, "status": {"created": "2025-10-14T07:51:08.318859Z", "last_refresh": "2025-10-14T10:06:51.722307Z", "running": 6, "size": 6}}]}, "changed": false} TASK [ceph_migrate : Print Service Map] **************************************** skipping: [np0005486728.localdomain] => (item={'placement': {'host_pattern': '*'}, 'service_name': 'crash', 'service_type': 'crash', 'status': {'created': '2025-10-14T07:50:27.394836Z', 'last_refresh': '2025-10-14T10:06:51.722171Z', 'running': 3, 'size': 3}}) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": {"placement": {"host_pattern": "*"}, "service_name": "crash", "service_type": "crash", "status": {"created": "2025-10-14T07:50:27.394836Z", "last_refresh": "2025-10-14T10:06:51.722171Z", "running": 3, "size": 3}}} skipping: [np0005486728.localdomain] => (item={'placement': {'label': 'mds'}, 'service_id': 'mds', 'service_name': 'mds.mds', 'service_type': 'mds', 'status': {'created': '2025-10-14T10:00:01.922314Z', 'last_refresh': '2025-10-14T10:06:51.722474Z', 'running': 3, 'size': 3}}) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": {"placement": {"label": "mds"}, "service_id": "mds", "service_name": "mds.mds", "service_type": "mds", "status": {"created": "2025-10-14T10:00:01.922314Z", "last_refresh": "2025-10-14T10:06:51.722474Z", "running": 3, "size": 3}}} skipping: [np0005486728.localdomain] => (item={'placement': {'label': 'mgr'}, 'service_name': 'mgr', 'service_type': 'mgr', 'status': {'created': '2025-10-14T10:01:29.541619Z', 'last_refresh': '2025-10-14T10:06:51.722551Z', 'running': 3, 'size': 3}}) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": {"placement": {"label": "mgr"}, "service_name": "mgr", "service_type": "mgr", "status": {"created": "2025-10-14T10:01:29.541619Z", "last_refresh": "2025-10-14T10:06:51.722551Z", "running": 3, "size": 3}}} skipping: [np0005486728.localdomain] => (item={'events': ['2025-10-14T10:07:10.346293Z service:mon [INFO] "service was created"'], 'placement': {'label': 'mon'}, 'service_name': 'mon', 'service_type': 'mon', 'status': {'created': '2025-10-14T10:07:02.843470Z', 'last_refresh': '2025-10-14T10:06:51.722626Z', 'running': 3, 'size': 3}}) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": {"events": ["2025-10-14T10:07:10.346293Z service:mon [INFO] \"service was created\""], "placement": {"label": "mon"}, "service_name": "mon", "service_type": "mon", "status": {"created": "2025-10-14T10:07:02.843470Z", "last_refresh": 
"2025-10-14T10:06:51.722626Z", "running": 3, "size": 3}}} skipping: [np0005486728.localdomain] => (item={'placement': {'host_pattern': '*'}, 'service_name': 'node-proxy', 'service_type': 'node-proxy', 'status': {'created': '2025-10-14T07:50:41.216505Z', 'running': 0, 'size': 0}}) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": {"placement": {"host_pattern": "*"}, "service_name": "node-proxy", "service_type": "node-proxy", "status": {"created": "2025-10-14T07:50:41.216505Z", "running": 0, "size": 0}}} skipping: [np0005486728.localdomain] => (item={'placement': {'hosts': ['np0005486731.localdomain', 'np0005486732.localdomain', 'np0005486733.localdomain']}, 'service_id': 'default_drive_group', 'service_name': 'osd.default_drive_group', 'service_type': 'osd', 'spec': {'data_devices': {'paths': ['/dev/ceph_vg0/ceph_lv0', '/dev/ceph_vg1/ceph_lv1']}, 'filter_logic': 'AND', 'objectstore': 'bluestore'}, 'status': {'created': '2025-10-14T07:51:08.318859Z', 'last_refresh': '2025-10-14T10:06:51.722307Z', 'running': 6, 'size': 6}}) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": {"placement": {"hosts": ["np0005486731.localdomain", "np0005486732.localdomain", "np0005486733.localdomain"]}, "service_id": "default_drive_group", "service_name": "osd.default_drive_group", "service_type": "osd", "spec": {"data_devices": {"paths": ["/dev/ceph_vg0/ceph_lv0", "/dev/ceph_vg1/ceph_lv1"]}, "filter_logic": "AND", "objectstore": "bluestore"}, "status": {"created": "2025-10-14T07:51:08.318859Z", "last_refresh": "2025-10-14T10:06:51.722307Z", "running": 6, "size": 6}}} skipping: [np0005486728.localdomain] => {"msg": "All items skipped"} TASK [ceph_migrate : Dump ceph orch ls output to log file] ********************* changed: [np0005486728.localdomain] => {"changed": true, "checksum": "c4ec4d4122db8e100b16e925e37655711f6c45a2", "dest": "/home/tripleo-admin/ceph_client/logs/ceph_orch_ls.log", "gid": 1003, "group": "tripleo-admin", "md5sum": "d93cbfea0d77ae14925da9778db0491e", "mode": "0644", "owner": "tripleo-admin", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 1600, "src": "/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760436468.2935684-59107-188626294255358/source", "state": "file", "uid": 1003} TASK [ceph_migrate : Get Ceph config] ****************************************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["cephadm", "shell", "--", "ceph", "config", "dump", "-f", "json"], "delta": "0:00:04.567057", "end": "2025-10-14 10:07:54.256758", "msg": "", "rc": 0, "start": "2025-10-14 10:07:49.689701", "stderr": "Inferring fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf\nInferring config /etc/ceph/ceph.conf\nUsing ceph image with id '2cc8f762c607' and tag 'latest' created on 2025-09-24 09:04:33 +0000 UTC\nregistry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8", "stderr_lines": ["Inferring fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "Inferring config /etc/ceph/ceph.conf", "Using ceph image with id '2cc8f762c607' and tag 'latest' created on 2025-09-24 09:04:33 +0000 UTC", "registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8"], "stdout": 
"\n[{\"section\":\"global\",\"name\":\"cluster_network\",\"value\":\"172.20.0.0/24\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"container_image\",\"value\":\"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\",\"level\":\"basic\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"ms_bind_ipv4\",\"value\":\"true\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"ms_bind_ipv6\",\"value\":\"false\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"public_network\",\"value\":\"172.18.0.0/24\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mon\",\"name\":\"auth_allow_insecure_global_id_reclaim\",\"value\":\"false\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/container_image_base\",\"value\":\"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/container_image_haproxy\",\"value\":\"registry.redhat.io/rhceph/rhceph-haproxy-rhel9:latest\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/container_image_keepalived\",\"value\":\"registry.redhat.io/rhceph/keepalived-rhel9:latest\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/container_init\",\"value\":\"True\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/migration_current\",\"value\":\"7\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/use_repo_digest\",\"value\":\"false\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/orchestrator/orchestrator\",\"value\":\"cephadm\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"osd\",\"name\":\"osd_memory_target\",\"value\":\"5709084876\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"host:np0005486731\",\"location_type\":\"host\",\"location_value\":\"np0005486731\"},{\"section\":\"osd\",\"name\":\"osd_memory_target\",\"value\":\"5709084876\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"host:np0005486732\",\"location_type\":\"host\",\"location_value\":\"np0005486732\"},{\"section\":\"osd\",\"name\":\"osd_memory_target\",\"value\":\"5709084876\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"host:np0005486733\",\"location_type\":\"host\",\"location_value\":\"np0005486733\"},{\"section\":\"osd\",\"name\":\"osd_memory_target_autotune\",\"value\":\"true\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"mds.mds\",\"name\":\"mds_join_fs\",\"value\":\"mds\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"mds.np0005486732.xkownj\",\"name\":\"mds_join_fs\",\"value\":\"cephfs\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"\"}]", "stdout_lines": ["", 
"[{\"section\":\"global\",\"name\":\"cluster_network\",\"value\":\"172.20.0.0/24\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"container_image\",\"value\":\"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\",\"level\":\"basic\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"ms_bind_ipv4\",\"value\":\"true\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"ms_bind_ipv6\",\"value\":\"false\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"global\",\"name\":\"public_network\",\"value\":\"172.18.0.0/24\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mon\",\"name\":\"auth_allow_insecure_global_id_reclaim\",\"value\":\"false\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/container_image_base\",\"value\":\"registry.redhat.io/rhceph/rhceph-7-rhel9:latest\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/container_image_haproxy\",\"value\":\"registry.redhat.io/rhceph/rhceph-haproxy-rhel9:latest\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/container_image_keepalived\",\"value\":\"registry.redhat.io/rhceph/keepalived-rhel9:latest\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/container_init\",\"value\":\"True\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/migration_current\",\"value\":\"7\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/cephadm/use_repo_digest\",\"value\":\"false\",\"level\":\"advanced\",\"can_update_at_runtime\":false,\"mask\":\"\"},{\"section\":\"mgr\",\"name\":\"mgr/orchestrator/orchestrator\",\"value\":\"cephadm\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"osd\",\"name\":\"osd_memory_target\",\"value\":\"5709084876\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"host:np0005486731\",\"location_type\":\"host\",\"location_value\":\"np0005486731\"},{\"section\":\"osd\",\"name\":\"osd_memory_target\",\"value\":\"5709084876\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"host:np0005486732\",\"location_type\":\"host\",\"location_value\":\"np0005486732\"},{\"section\":\"osd\",\"name\":\"osd_memory_target\",\"value\":\"5709084876\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"host:np0005486733\",\"location_type\":\"host\",\"location_value\":\"np0005486733\"},{\"section\":\"osd\",\"name\":\"osd_memory_target_autotune\",\"value\":\"true\",\"level\":\"advanced\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"mds.mds\",\"name\":\"mds_join_fs\",\"value\":\"mds\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"\"},{\"section\":\"mds.np0005486732.xkownj\",\"name\":\"mds_join_fs\",\"value\":\"cephfs\",\"level\":\"basic\",\"can_update_at_runtime\":true,\"mask\":\"\"}]"]} TASK [ceph_migrate : Print Ceph config dump] *********************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Dump ceph config dump output to log file] ***************** changed: [np0005486728.localdomain] => {"changed": 
true, "checksum": "03856d40254ac15170fc98bd46775ec36d1d2144", "dest": "/home/tripleo-admin/ceph_client/logs/ceph_config_dump.log", "gid": 1003, "group": "tripleo-admin", "md5sum": "0b47b8c4961b5b63d9fa3a5ecfc02a5e", "mode": "0644", "owner": "tripleo-admin", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 3044, "src": "/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760436474.4803946-59134-28004536690316/source", "state": "file", "uid": 1003} TASK [ceph_migrate : Get Ceph Orch Host Map] *********************************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["cephadm", "shell", "--", "ceph", "orch", "host", "ls", "-f", "json"], "delta": "0:00:04.343995", "end": "2025-10-14 10:08:00.146470", "msg": "", "rc": 0, "start": "2025-10-14 10:07:55.802475", "stderr": "Inferring fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf\nInferring config /etc/ceph/ceph.conf\nUsing ceph image with id '2cc8f762c607' and tag 'latest' created on 2025-09-24 09:04:33 +0000 UTC\nregistry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8", "stderr_lines": ["Inferring fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "Inferring config /etc/ceph/ceph.conf", "Using ceph image with id '2cc8f762c607' and tag 'latest' created on 2025-09-24 09:04:33 +0000 UTC", "registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8"], "stdout": "\n[{\"addr\": \"192.168.122.106\", \"hostname\": \"np0005486731.localdomain\", \"labels\": [\"osd\", \"mds\", \"mgr\", \"mon\", \"_admin\"], \"status\": \"\"}, {\"addr\": \"192.168.122.107\", \"hostname\": \"np0005486732.localdomain\", \"labels\": [\"osd\", \"mds\", \"mgr\", \"mon\", \"_admin\"], \"status\": \"\"}, {\"addr\": \"192.168.122.108\", \"hostname\": \"np0005486733.localdomain\", \"labels\": [\"osd\", \"mds\", \"mgr\", \"mon\", \"_admin\"], \"status\": \"\"}]", "stdout_lines": ["", "[{\"addr\": \"192.168.122.106\", \"hostname\": \"np0005486731.localdomain\", \"labels\": [\"osd\", \"mds\", \"mgr\", \"mon\", \"_admin\"], \"status\": \"\"}, {\"addr\": \"192.168.122.107\", \"hostname\": \"np0005486732.localdomain\", \"labels\": [\"osd\", \"mds\", \"mgr\", \"mon\", \"_admin\"], \"status\": \"\"}, {\"addr\": \"192.168.122.108\", \"hostname\": \"np0005486733.localdomain\", \"labels\": [\"osd\", \"mds\", \"mgr\", \"mon\", \"_admin\"], \"status\": \"\"}]"]} TASK [ceph_migrate : Load nodes] *********************************************** ok: [np0005486728.localdomain] => {"ansible_facts": {"nds": [{"addr": "192.168.122.106", "hostname": "np0005486731.localdomain", "labels": ["osd", "mds", "mgr", "mon", "_admin"], "status": ""}, {"addr": "192.168.122.107", "hostname": "np0005486732.localdomain", "labels": ["osd", "mds", "mgr", "mon", "_admin"], "status": ""}, {"addr": "192.168.122.108", "hostname": "np0005486733.localdomain", "labels": ["osd", "mds", "mgr", "mon", "_admin"], "status": ""}]}, "changed": false} TASK [ceph_migrate : Load hostmap List] **************************************** ok: [np0005486728.localdomain] => {"ansible_facts": {"hostmap": {"np0005486731.localdomain": ["osd", "mds", "mgr", "mon", "_admin"], "np0005486732.localdomain": ["osd", "mds", "mgr", "mon", "_admin"], "np0005486733.localdomain": ["osd", "mds", "mgr", "mon", "_admin"]}}, "changed": false} TASK [ceph_migrate : Print Host Map] ******************************************* skipping: [np0005486728.localdomain] => (item=np0005486731.localdomain) => {"ansible_loop_var": "item", 
"false_condition": "debug | default(false)", "item": "np0005486731.localdomain"} skipping: [np0005486728.localdomain] => (item=np0005486732.localdomain) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": "np0005486732.localdomain"} skipping: [np0005486728.localdomain] => (item=np0005486733.localdomain) => {"ansible_loop_var": "item", "false_condition": "debug | default(false)", "item": "np0005486733.localdomain"} skipping: [np0005486728.localdomain] => {"msg": "All items skipped"} TASK [ceph_migrate : Dump ceph orch host ls output to log file] **************** changed: [np0005486728.localdomain] => {"changed": true, "checksum": "6be81cf16fc517df38cf5733d95550d4f8d2cfae", "dest": "/home/tripleo-admin/ceph_client/logs/ceph_orch_host_ls.log", "gid": 1003, "group": "tripleo-admin", "md5sum": "31799669ea57424eb2897907eb280852", "mode": "0644", "owner": "tripleo-admin", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 84, "src": "/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760436480.4757948-59165-174260504444941/source", "state": "file", "uid": 1003} TASK [ceph_migrate : Get Ceph monmap and load data] **************************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["cephadm", "shell", "--", "ceph", "mon", "dump", "-f", "json"], "delta": "0:00:04.612306", "end": "2025-10-14 10:08:06.496848", "msg": "", "rc": 0, "start": "2025-10-14 10:08:01.884542", "stderr": "Inferring fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf\nInferring config /etc/ceph/ceph.conf\nUsing ceph image with id '2cc8f762c607' and tag 'latest' created on 2025-09-24 09:04:33 +0000 UTC\nregistry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8\ndumped monmap epoch 17", "stderr_lines": ["Inferring fsid fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "Inferring config /etc/ceph/ceph.conf", "Using ceph image with id '2cc8f762c607' and tag 'latest' created on 2025-09-24 09:04:33 +0000 UTC", "registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:c25fa4419cdc17fed403399149562b7a8aa53b14f4082888ea13258ff48ac8e8", "dumped monmap epoch 17"], "stdout": "\n{\"epoch\":17,\"fsid\":\"fcadf6e2-9176-5818-a8d0-37b19acf8eaf\",\"modified\":\"2025-10-14T10:06:36.543119Z\",\"created\":\"2025-10-14T07:49:51.150761Z\",\"min_mon_release\":18,\"min_mon_release_name\":\"reef\",\"election_strategy\":1,\"disallowed_leaders: \":\"\",\"stretch_mode\":false,\"tiebreaker_mon\":\"\",\"removed_ranks: 
\":\"\",\"features\":{\"persistent\":[\"kraken\",\"luminous\",\"mimic\",\"osdmap-prune\",\"nautilus\",\"octopus\",\"pacific\",\"elector-pinging\",\"quincy\",\"reef\"],\"optional\":[]},\"mons\":[{\"rank\":0,\"name\":\"np0005486731\",\"public_addrs\":{\"addrvec\":[{\"type\":\"v2\",\"addr\":\"172.18.0.103:3300\",\"nonce\":0},{\"type\":\"v1\",\"addr\":\"172.18.0.103:6789\",\"nonce\":0}]},\"addr\":\"172.18.0.103:6789/0\",\"public_addr\":\"172.18.0.103:6789/0\",\"priority\":0,\"weight\":0,\"crush_location\":\"{}\"},{\"rank\":1,\"name\":\"np0005486732\",\"public_addrs\":{\"addrvec\":[{\"type\":\"v2\",\"addr\":\"172.18.0.104:3300\",\"nonce\":0},{\"type\":\"v1\",\"addr\":\"172.18.0.104:6789\",\"nonce\":0}]},\"addr\":\"172.18.0.104:6789/0\",\"public_addr\":\"172.18.0.104:6789/0\",\"priority\":0,\"weight\":0,\"crush_location\":\"{}\"},{\"rank\":2,\"name\":\"np0005486733\",\"public_addrs\":{\"addrvec\":[{\"type\":\"v2\",\"addr\":\"172.18.0.105:3300\",\"nonce\":0},{\"type\":\"v1\",\"addr\":\"172.18.0.105:6789\",\"nonce\":0}]},\"addr\":\"172.18.0.105:6789/0\",\"public_addr\":\"172.18.0.105:6789/0\",\"priority\":0,\"weight\":0,\"crush_location\":\"{}\"}],\"quorum\":[0,1,2]}", "stdout_lines": ["", "{\"epoch\":17,\"fsid\":\"fcadf6e2-9176-5818-a8d0-37b19acf8eaf\",\"modified\":\"2025-10-14T10:06:36.543119Z\",\"created\":\"2025-10-14T07:49:51.150761Z\",\"min_mon_release\":18,\"min_mon_release_name\":\"reef\",\"election_strategy\":1,\"disallowed_leaders: \":\"\",\"stretch_mode\":false,\"tiebreaker_mon\":\"\",\"removed_ranks: \":\"\",\"features\":{\"persistent\":[\"kraken\",\"luminous\",\"mimic\",\"osdmap-prune\",\"nautilus\",\"octopus\",\"pacific\",\"elector-pinging\",\"quincy\",\"reef\"],\"optional\":[]},\"mons\":[{\"rank\":0,\"name\":\"np0005486731\",\"public_addrs\":{\"addrvec\":[{\"type\":\"v2\",\"addr\":\"172.18.0.103:3300\",\"nonce\":0},{\"type\":\"v1\",\"addr\":\"172.18.0.103:6789\",\"nonce\":0}]},\"addr\":\"172.18.0.103:6789/0\",\"public_addr\":\"172.18.0.103:6789/0\",\"priority\":0,\"weight\":0,\"crush_location\":\"{}\"},{\"rank\":1,\"name\":\"np0005486732\",\"public_addrs\":{\"addrvec\":[{\"type\":\"v2\",\"addr\":\"172.18.0.104:3300\",\"nonce\":0},{\"type\":\"v1\",\"addr\":\"172.18.0.104:6789\",\"nonce\":0}]},\"addr\":\"172.18.0.104:6789/0\",\"public_addr\":\"172.18.0.104:6789/0\",\"priority\":0,\"weight\":0,\"crush_location\":\"{}\"},{\"rank\":2,\"name\":\"np0005486733\",\"public_addrs\":{\"addrvec\":[{\"type\":\"v2\",\"addr\":\"172.18.0.105:3300\",\"nonce\":0},{\"type\":\"v1\",\"addr\":\"172.18.0.105:6789\",\"nonce\":0}]},\"addr\":\"172.18.0.105:6789/0\",\"public_addr\":\"172.18.0.105:6789/0\",\"priority\":0,\"weight\":0,\"crush_location\":\"{}\"}],\"quorum\":[0,1,2]}"]} TASK [ceph_migrate : Get Monmap] *********************************************** ok: [np0005486728.localdomain] => {"ansible_facts": {"mon_dump": {"created": "2025-10-14T07:49:51.150761Z", "disallowed_leaders: ": "", "election_strategy": 1, "epoch": 17, "features": {"optional": [], "persistent": ["kraken", "luminous", "mimic", "osdmap-prune", "nautilus", "octopus", "pacific", "elector-pinging", "quincy", "reef"]}, "fsid": "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "min_mon_release": 18, "min_mon_release_name": "reef", "modified": "2025-10-14T10:06:36.543119Z", "mons": [{"addr": "172.18.0.103:6789/0", "crush_location": "{}", "name": "np0005486731", "priority": 0, "public_addr": "172.18.0.103:6789/0", "public_addrs": {"addrvec": [{"addr": "172.18.0.103:3300", "nonce": 0, "type": "v2"}, {"addr": "172.18.0.103:6789", "nonce": 0, "type": 
"v1"}]}, "rank": 0, "weight": 0}, {"addr": "172.18.0.104:6789/0", "crush_location": "{}", "name": "np0005486732", "priority": 0, "public_addr": "172.18.0.104:6789/0", "public_addrs": {"addrvec": [{"addr": "172.18.0.104:3300", "nonce": 0, "type": "v2"}, {"addr": "172.18.0.104:6789", "nonce": 0, "type": "v1"}]}, "rank": 1, "weight": 0}, {"addr": "172.18.0.105:6789/0", "crush_location": "{}", "name": "np0005486733", "priority": 0, "public_addr": "172.18.0.105:6789/0", "public_addrs": {"addrvec": [{"addr": "172.18.0.105:3300", "nonce": 0, "type": "v2"}, {"addr": "172.18.0.105:6789", "nonce": 0, "type": "v1"}]}, "rank": 2, "weight": 0}], "quorum": [0, 1, 2], "removed_ranks: ": "", "stretch_mode": false, "tiebreaker_mon": ""}}, "changed": false} TASK [ceph_migrate : Print monmap] ********************************************* skipping: [np0005486728.localdomain] => {"false_condition": "debug | default(false)"} TASK [ceph_migrate : Dump ceph mon dump output to log file] ******************** changed: [np0005486728.localdomain] => {"changed": true, "checksum": "7c0bfdf7e56cefefc18dd98b68285f8c10d6aab5", "dest": "/home/tripleo-admin/ceph_client/logs/ceph_mon_dump.log", "gid": 1003, "group": "tripleo-admin", "md5sum": "84f15240a3d5822b523c037b1814ec98", "mode": "0644", "owner": "tripleo-admin", "secontext": "unconfined_u:object_r:user_home_t:s0", "size": 1425, "src": "/home/tripleo-admin/.ansible/tmp/ansible-tmp-1760436486.735537-59194-4493235021488/source", "state": "file", "uid": 1003} TASK [ceph_migrate : Load nodes to decommission] ******************************* ok: [np0005486728.localdomain] => {"ansible_facts": {"decomm_nodes": ["np0005486731.localdomain", "np0005486732.localdomain", "np0005486733.localdomain"]}, "changed": false} TASK [ceph_migrate : Load target nodes] **************************************** ok: [np0005486728.localdomain] => {"ansible_facts": {"target_nodes": ["np0005486731.localdomain", "np0005486732.localdomain", "np0005486733.localdomain"]}, "changed": false} TASK [ceph_migrate : Print target nodes] *************************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug|default(false)"} TASK [ceph_migrate : Print decomm_nodes] *************************************** skipping: [np0005486728.localdomain] => {"false_condition": "debug|default(false)"} TASK [ceph_migrate : Configure Swift to use rgw backend] *********************** skipping: [np0005486728.localdomain] => {"changed": false, "false_condition": "ceph_daemons_layout.rgw | default(true) | bool", "skip_reason": "Conditional result was False"} RUNNING HANDLER [ceph_migrate : restart mgr] *********************************** changed: [np0005486728.localdomain] => {"changed": true, "cmd": ["podman", "run", "--rm", "--net=host", "--ipc=host", "--volume", "/home/tripleo-admin/ceph_client:/etc/ceph:z", "--entrypoint", "ceph", "registry.redhat.io/rhceph/rhceph-7-rhel9:latest", "--fsid", "fcadf6e2-9176-5818-a8d0-37b19acf8eaf", "-c", "/etc/ceph/ceph.conf", "-k", "/etc/ceph/ceph.client.admin.keyring", "mgr", "fail"], "delta": "0:00:00.774032", "end": "2025-10-14 10:08:09.156542", "msg": "", "rc": 0, "start": "2025-10-14 10:08:08.382510", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} PLAY RECAP ********************************************************************* np0005486728.localdomain : ok=233 changed=109 unreachable=0 failed=0 skipped=140 rescued=0 ignored=0