The ability to collect, store and analyze vast amounts of data is a major challenge in oceanographic research. Dr. Steve Morey, a physical oceanographer at the Florida State University Center for Ocean and Atmospheric Prediction Studies (COAPS), is well acquainted with the challenges of conducting research at such a scale.
Big data is more than a buzzword for Dr. Morey and his colleague Dr. Dmitry Dukhovskoy, who have been working with FSU's Research Computing Center (RCC) for six years. COAPS has produced, managed and analyzed more than 49 terabytes of data. This data has been used to generate ocean topography models, study how currents affect fisheries and even explore the effects of the Deepwater Horizon oil spill. Morey's recent forecasting work in the Gulf of Mexico will help ensure safe ultra-deepwater exploration and petroleum extraction in the future.
"Taking the vast amount of data in ocean models and putting it into a form that can be presented is one of the most challenging aspects of our work," states Morey. "Our research is critically dependent upon parallel computing systems and large data storage systems. If FSU did not support on-campus research computing, we would have to use off-campus resources, which would dramatically decrease our efficiency."
Dr. Morey and other researchers at COAPS have partnered with the Florida State Research Computing Center to help mitigate these challenges. Using High Performance Computing (HPC) and Spear systems, which allow high-speed processing and interactive data analysis, COAPS is able to collect, store, analyze and generate visual models for its data within an integrated infrastructure.
In Morey’s words, “The resources available today for high performance computing at FSU allow scientists to actively pursue problems that were considered intractable ten years ago.”