This page describes how to connect to the High Performance Computing Cluster, and how to use the features available on the login nodes.
To connect to the HPC, you must either be on-campus or be connected to the FSU VPN.
Use an SSH client to connect:
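A minimal connection command looks like the following; the hostname `hpc-login.example.edu` is a placeholder, not the real address, so substitute the login hostname from your account overview along with your own username:

```shell
# Connect to an HPC login node over SSH.
# Hostname below is a placeholder -- use the address given in your
# account overview or the cluster documentation.
ssh your_username@hpc-login.example.edu
```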
For more information about using SSH in general, refer to our guide.
Once you authenticate via SSH, you will be connected to one of our three HPC login nodes. When you first log in, you are placed in your home directory, which resides on our GPFS filesystem. Your disk quota is 150 GB. If you temporarily need more storage, you may request temporary scratch space.
If you are a member of a research group with extra storage space, you may have access to other directories in addition to your home directory. Refer to your account overview for more information. You can also see which resources you have access to on the HPC by using the RCCTool:
$ rcctool my:account
Tools Available on the HPC
The HPC login nodes are loaded with many open source and commercial software packages and libraries. In addition, all of the standard Linux utilities are available (text editors such as vi and nano, CLI tools, etc.). See all available software.
Some software packages require you to load modules before you use them. To better understand how modules work, refer to our brief guide.
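As a sketch of a typical module workflow (the package name and version below are illustrative; run `module avail` to see what is actually installed on the login nodes):

```shell
# List all software modules available on the login nodes
module avail

# Load a module before using its software (name/version are examples only)
module load gnu/8.3.1

# Show which modules are currently loaded in your session
module list

# Unload all modules when you want a clean environment
module purge
```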
Copying Data to and from the HPC
You may wish to copy data from external systems (such as your workstation or another server) onto the HPC, or vice versa. For details about how to move data to and from the HPC GPFS filesystem, refer to our guide.
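For example, standard SSH-based tools such as `scp` and `rsync` can move files in either direction; the hostname and paths below are placeholders:

```shell
# Copy a local file into your HPC home directory
# (hostname is a placeholder -- use the real login node address)
scp results.tar.gz your_username@hpc-login.example.edu:~/

# Copy a directory from the HPC back to your workstation;
# -a preserves permissions/timestamps, -P resumes partial transfers
rsync -avP your_username@hpc-login.example.edu:~/project/output/ ./output/
```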
There are three compiler packages installed on the HPC Login Nodes:
Submitting HPC Jobs
The primary purpose of the HPC login nodes is to enable job submission and management on our powerful HPC cluster. Compute jobs do not run directly on the login nodes; once you compile your code and configure your job submission parameters, your jobs run non-interactively on the cluster's compute nodes.
Refer to our guide on submitting and managing jobs for details.
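If the cluster uses the Slurm scheduler (common on HPC systems, but confirm in the site guide), a minimal batch script might look like the following sketch; the resource values are placeholders to adjust for your allocation:

```shell
#!/bin/bash
# Minimal batch script sketch -- directives assume the Slurm scheduler;
# job name, task count, walltime, and memory below are example values.
#SBATCH --job-name=example
#SBATCH --ntasks=1
#SBATCH --time=00:10:00
#SBATCH --mem=1G

echo "Running on $(hostname)"
```

Under that assumption, you would submit the script with `sbatch myjob.sh` and check its status with `squeue -u $USER`.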