GROMACS
GROMACS is a powerful and versatile molecular dynamics simulation package. It can solve the Newtonian equations of motion for extremely large systems, up to hundreds of millions of particles. Its primary purpose is simulating biochemical systems with many complex bonded interactions, such as proteins, lipids, and nucleic acids. Because it computes non-bonded interactions extremely quickly, it is also well suited to simulating non-biological systems.
Use of GROMACS on RCC Resources
GROMACS is one of the computational chemistry programs available on both the Spear and HPC systems. It can be loaded along with the MPI libraries with module load gnu openmpi. The MPI library is required for parallel runs. If you want to run GROMACS serially and/or interactively, you do not need the openmpi module, so you can simply run module load gnu. Once the modules are loaded, invoke GROMACS by prefixing any GROMACS command with gmx. For example, the pdb2gmx GROMACS command would be run as gmx pdb2gmx [options and filenames]. Examples of how to use GROMACS can be found on the GROMACS Tutorial and Documentation pages.
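As a quick illustration, a typical interactive session might start as follows. The input file name protein.pdb and the water model choice are placeholders drawn from common GROMACS tutorials, not RCC-specific defaults.
# Load the GNU environment (serial/interactive use)
module load gnu
# Convert a PDB structure into GROMACS format, choosing the SPC/E water model
gmx pdb2gmx -f protein.pdb -o processed.gro -water spce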
Running Serial GROMACS Jobs on HPC
To use GROMACS on the HPC, you first need to load the GNU environment module:
# Load the GNU module
module load gnu
Then, GROMACS can be run in serial by using the gmx commands described in the GROMACS documentation. For example, to run an energy minimization using molecular dynamics (see the GROMACS Tutorial for more details), you would use the following command.
gmx mdrun -v -deffnm em
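Note that mdrun reads a portable run input (.tpr) file, which is produced beforehand by the gmx grompp preprocessor. A minimal sketch of that step, assuming tutorial-style file names (em.mdp, solvated.gro, topol.top), would be:
# Combine run parameters, coordinates, and topology into the run input file em.tpr
gmx grompp -f em.mdp -c solvated.gro -p topol.top -o em.tpr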
Running Parallel GROMACS Jobs with MPI
GROMACS can be run in parallel across multiple nodes using MPI, or across multiple cores on a shared-memory node using OpenMP. To run a parallel job on multiple nodes on RCC systems, first load the gnu and openmpi modules; then submit the job using a SLURM script such as the one shown below. For more information on accelerating GROMACS and running it in parallel, refer to the GROMACS documentation on that topic and on the mdrun command itself.
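For shared-memory parallelism on a single node, mdrun's -ntomp option sets the number of OpenMP threads. A brief sketch, where the thread count of 8 is illustrative and should match the cores you request:
# Run energy minimization with 8 OpenMP threads on one node
gmx mdrun -ntomp 8 -v -deffnm em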
GROMACS Example SLURM Submit Script
#!/bin/bash
#SBATCH -J testGROMACS   # Job name
#SBATCH -n 16            # Number of MPI tasks
#SBATCH -p backfill      # Partition (queue) to submit to
#SBATCH -t 4:00:00       # Wall-clock time limit (hh:mm:ss)

module load gnu openmpi
srun gmx mdrun -v -deffnm em
Save this script in a .sh file; for example, it could be named TESTJOB.sh. Then, submit it to the cluster using the following command.
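sbatch TESTJOB.sh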