Facilities Statement

Feel free to copy relevant portions of this text into your research proposals. It includes language appropriate for inclusion in facilities descriptions for NSF, NIH, and other agency proposals.

The FSU Research Computing Center (RCC) operates within the university as an academic services unit of Information Technology Services (ITS). The RCC Director oversees 7.5 permanent professional staff and a variable number of term-limited project staff and students. The RCC staff and students are responsible for maintaining core systems and are assigned to work in support of specific research domain projects. The RCC staff offices are located on the main FSU campus on the first floor of the Dirac Science Library. Each RCC office is equipped with workstations, monitors, and essential software applications. The RCC also provides two conference rooms with projection and video conferencing capabilities.

High Performance Computing

The FSU HPC system comprises 9,280 x86 64-bit compute cores linked together by low-latency InfiniBand networks for MPI communication. The aggregate peak performance of the system is 201.4 TFLOPS. Compute nodes provide between 2 and 16 GB of memory per core. A redundant cluster of specialized nodes serves as the user entry point for the system. The Slurm job scheduler handles job submission and scheduling on the HPC system. A broad set of compilers, math and communication libraries, and software applications is made available on the HPC system to maximize its utility for users across diverse academic disciplines.
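
As an illustration, submitting an MPI job through Slurm typically involves a short batch script along the lines of the sketch below; the module name and executable are placeholders, not RCC-specific values.

    #!/bin/bash
    #SBATCH --job-name=example_mpi      # name shown in the queue
    #SBATCH --ntasks=64                 # number of MPI ranks
    #SBATCH --mem-per-cpu=2G            # per-core memory (nodes offer 2-16 GB/core)
    #SBATCH --time=04:00:00             # wall-clock limit

    module load openmpi                 # load an MPI environment (module name assumed)
    srun ./my_mpi_program               # launch the program across the allocated cores

The script would be submitted with "sbatch job.sh", and its status can be checked with "squeue".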

The cluster also includes nodes with 16 NVIDIA Tesla GPGPU cards (M2050), for a total of 7,168 GPU cores and a peak system performance of 19.8 TFLOPS.
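
A GPU job on these nodes could be requested through Slurm's generic-resource mechanism, as in the sketch below; the GRES label and CUDA module name are assumptions about the local configuration.

    #!/bin/bash
    #SBATCH --ntasks=1                  # one task driving the GPU
    #SBATCH --gres=gpu:1                # request one Tesla M2050 (GRES label assumed)

    module load cuda                    # CUDA toolkit environment (module name assumed)
    srun ./my_cuda_program              # run the GPU-accelerated executable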

Interactive Computing and Scientific Visualization

Large data sets generated on RCC computing resources or by research instrumentation on or off FSU's campus can be interactively analyzed and explored using the Spear Cluster. The Spear Cluster comprises over 280 x86 64-bit processor cores linked together by a QDR InfiniBand network. Users log directly into Spear nodes and run jobs interactively through a basic shell or X11-based graphical applications.
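
A typical interactive session might begin as sketched below; the hostname shown is illustrative only, not the actual Spear login address.

    # connect with X11 forwarding so graphical output displays locally
    # (hostname is a hypothetical placeholder)
    ssh -Y username@spear.rcc.fsu.edu

    # then run analysis tools interactively, e.g. a plotting session
    gnuplot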

DSC Visualization Laboratory

FSU also supports a general-access laboratory for scientific visualization in the Department of Scientific Computing (DSC). The laboratory is located in the center of the main FSU campus on the fourth floor of the Dirac Science Library (DSL). The Visualization Laboratory includes five high-end visualization workstations, each equipped with NVIDIA GPU video cards that are compatible with the CUDA SDK.

The University's visualization resources also include a high-resolution stereographic projection system, funded by the National Science Foundation, to support multidisciplinary scientific visualization. The system is located in an 80-person seminar room adjacent to the Visualization Laboratory. Four state-of-the-art rear-mounted projectors illuminate an 18' x 8' screen. The system switches between 2D and 3D modes at the touch of a button and also supports numerous other input devices.

The FSU Department of Scientific Computing also supports a state-of-the-art classroom for 3D immersive visualization and game design courses. This classroom is equipped with 19 high-end workstations attached to 23" passive 3D monitors. A 62" passive 3D monitor is connected to the instructor workstation. Both the seminar room and the classroom support a wide variety of remote collaboration technologies, including Access Grid and EVO - The Collaboration Network.

Data Storage

The RCC maintains two high-performance parallel file systems to facilitate data analysis pipelines and workflows. A 256 TB Panasas file system is mounted on all of the HPC compute nodes over a dedicated 1Gbps network. A 344 TB Lustre file system is available through a native interface on the interactive compute nodes and is available to systems and research instrumentation across campus via a cluster of load-balanced NFS servers.
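
On a campus workstation, the Lustre-backed NFS export could be mounted as in the sketch below; the server name and paths are hypothetical placeholders, not the actual RCC export details.

    # mount the campus NFS export of the Lustre file system
    # (server name and export path are hypothetical)
    sudo mount -t nfs nfs.rcc.fsu.edu:/lustre /mnt/research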

The RCC also maintains a 327 TB general-purpose storage system named "NoleStor", built on top of the DataDirect Networks Web Object Store (WOS) system. This system is provided for non-high-performance data archival. Portions of the NoleStor system connect directly to the 10Gbps Florida LambdaRail, linking it to similar systems at UF, UM, and other Florida universities. This makes high-speed data transfer and collaborative storage possible among major Florida universities.
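
For example, a research group might stage a results directory onto NoleStor with a standard transfer tool such as rsync; the host and archive path below are hypothetical placeholders.

    # copy results to archival storage (host and path are hypothetical)
    rsync -av ./results/ username@nolestor.rcc.fsu.edu:/archive/my_project/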

Data Center Facilities and Network Connectivity

Computing and data storage resources managed by the RCC occupy two core data center facilities: one located on the main FSU campus in the Dirac Science Library and the other located at Innovation Park in the Bernard Sliger Building. Both facilities are equipped with raised floors, redundant HVAC cooling systems, extensive power distribution systems, large-format UPS battery backup systems, and diesel-powered backup generators for prolonged outages. Each of the campus data centers is connected to the campus enterprise and research networks via multiple 10Gbps connections. Both data centers also connect directly to the Florida LambdaRail, a dedicated 10Gbps regional optical network.

Florida Cyberinfrastructure

Florida State University is a founding member of the Sunshine State Education and Research Computing Alliance (SSERCA), which was created in 2010 to bring together Florida's geographically distributed educational institutions in a way that maximizes their collective impact on research and education. SSERCA provides the management and technology framework for sharing and accessing resources distributed across the State of Florida. The alliance currently supports several projects with sophisticated workflows and complicated data and compute requirements. More details on how SSERCA is accelerating research in the State of Florida are available at http://sserca.org. Current member organizations include FAMU, FIT, FIU, FSU, UCF, UFL, UM, UNF, and USF. The Florida LambdaRail provides connectivity to all institutions in the alliance.

Cloud Computing

The RCC supports a cloud computing system built on the OpenStack software platform to allow quick deployment of web, database, and other non-high-performance computing applications. A self-service control panel gives research groups the ability to allocate and deploy virtual machines (VMs) on the fly. VMs in this system can connect to and leverage the other storage and compute resources provided by the RCC.
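
Beyond the control panel, a VM could also be launched with the standard OpenStack command-line client, as in the sketch below; the flavor, image, and network names are placeholders rather than RCC-specific values.

    # boot a small VM (flavor, image, and network names are placeholders)
    openstack server create \
        --flavor m1.small \
        --image ubuntu-server \
        --network research-net \
        my-web-vm

Once active, the VM can be reached over the campus network like any other host.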