GPFS Migration Guide

In March 2018, we began phasing out our Panasas and Lustre storage systems.  As part of this effort, we are consolidating all users onto our new GPFS storage system.  This includes both user home directories and research group data.

Migrating Your Home Directory to GPFS

All users will need to migrate their home directories from Panasas and/or Lustre to GPFS.  To aid in this process, we provide a migration utility, gpfs_migrate.

Before Migrating -- Considerations

  1. Your home directory path will change.  If you have any HPC jobs, Spear jobs, scripts, symlinks, or other programs that reference files or resources in your home directory, you will need to update the paths to reflect your new home directory.  Your new home directory will be /gpfs/home/[username].  See the first example after this list for one way to find references to your old path.
  2. Your HPC storage allocation will increase from 100GB to 150GB.  If you have more than 150GB in your Lustre (Spear) and Panasas (HPC) directories combined, you will need to clean out some files before you migrate your data.  See the second example after this list for one way to check your current usage.
  3. If you use our migration tool, your Panasas (HPC) data will be copied directly into your new home directory.  Your Lustre data will be copied into a sub-folder in your home directory named: LUSTRE_HOME_DIRECTORY.  You can re-arrange the files however you like once they are migrated.
  4. Make sure that you do not have multiple SSH sessions open or any HPC jobs running before you migrate your data.  Any process that writes to your home directory may corrupt the migration process.
  5. Migration typically takes between 30 minutes and eight hours, depending on system load and how much data needs to be copied.  If you use our migration tool, your account will be locked during this time.
  6. After the migration, your Panasas and Lustre home directories will be moved to an archive location, but they will remain available for you to read data from.
  7. No data will be deleted during the migration.
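
For example, one way to find scripts or configuration files that still reference your old home directory is to search for the old path prefix.  The prefixes below are placeholders; substitute the actual paths of your Panasas (HPC) and Lustre (Spear) home directories:

$ grep -rl "/panfs/storage.local" ~/
$ grep -rl "/lustre" ~/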
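
To check how much data you currently have, you can run du against each of your existing home directories.  Again, the paths below are placeholders; use your actual Panasas and Lustre home directory paths:

$ du -sh /panfs/storage.local/[your-old-home]
$ du -sh /lustre/[your-old-home]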

Migration Procedure

You may elect to have us automatically copy your Lustre data, your Panasas data, both, or neither during the migration procedure.  If you elect to skip copying, you can copy the data manually yourself at a later time.  Note that either way, we will archive your old Lustre and Panasas directories.
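
If you skip the automatic copy, one way to copy the data yourself later is with rsync, once your new GPFS home directory is active.  This is only a sketch; the source path below is a placeholder for the archive location of your old home directory, which varies by account:

$ rsync -av /path/to/archived/panasas-home/ /gpfs/home/[username]/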

Log in to any hpc-login node:

$ ssh [username]@hpc-login.rcc.fsu.edu

Run gpfs_migrate:

cam02h@hpc-login-25:~ $ gpfs_migrate

Read all of the warnings and notices.  Then select which home directories you want us to copy for you.

 Would you like us to copy your Panasas (HPC) data to GPFS for you? (yes/no) [yes]:
 > 

 Would you like us to copy your Lustre (Spear) data to GPFS for you? (yes/no) [yes]:
 > 

...

This will lock your RCC account until the migration completes.

 Type YES to proceed or NO to cancel:
 > yes

Connection to hpc-login.rcc.fsu.edu closed by remote host.
Connection to hpc-login.rcc.fsu.edu closed.

Your migration will take between 30 minutes and eight hours to complete, depending on system load and how much data we need to copy for you.  You will receive an email when the migration completes, or if something goes wrong.  If you do not receive an email within one day, let us know (support@rcc.fsu.edu).

The next time you login after the migration, your home directory will be on GPFS: /gpfs/home/[username].
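
You can verify the change from any login node.  For example:

$ echo $HOME
/gpfs/home/[username]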

Migrating Research Group Data to GPFS

For resource owners (i.e., those who have purchased shared space on Lustre or Panasas), we will reach out individually to coordinate a migration plan for your data.  This is a great opportunity to re-organize, consolidate, or re-structure your data in the directory tree if you want.  Alternatively, we can copy your data over as-is, and only the path prefix will change (/panfs/storage.local/[your-path] becomes /gpfs/research/[your-path]).
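
If your group keeps the same directory layout, updating job scripts and configuration files is usually just a matter of replacing the old prefix with the new one.  As a sketch (the filename below is illustrative; keep a backup of anything you edit):

$ sed -i.bak 's|/panfs/storage.local|/gpfs/research|g' my_job_script.sh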

If you have purchased storage on both Lustre and Panasas, we can consolidate it into a single volume on GPFS, or we can keep it separate.  We will work with you and your team to determine your preferences.

Timeline

We began enabling new user accounts on GPFS in mid-March 2018.  We anticipate that all users and research groups will be migrated by the end of Summer 2018.