High Performance Computing: Collaborations power research & learning

In a temperature-controlled room in an out-of-the-way part of campus, rows of computers hum, blink and whir in a High Performance Computing (HPC) Center called the DEAC (Distributed Environment for Academic Computing) Cluster.

Last year, this centralized HPC system processed over 21 million core-hours from more than 650,000 tasks that were submitted by Wake Forest researchers. It also hosts an enormous amount of centralized data supported and maintained by the HPC team within the University’s Information Systems (IS) department. 

According to EdTech Magazine, high performance resources are a competitive differentiator for institutions seeking to hire top researchers—especially those at universities that support research, teaching and learning and regularly upgrade their networks. A centrally maintained HPC Center means departmental funds and professors’ grants can be used to support teaching and research rather than technology.

Two decades ago, the Wake Forest Physics department formed the first HPC cluster on the Reynolda campus, which has since evolved into a centralized Information Systems resource used by several departments as well as the downtown campus and medical school. Since then, IS has invested heavily in the cluster, expanding the team and updating hardware as research needs have grown. Empowering and accelerating researchers and research remains a high priority for the department, as outlined in the IT Strategic Plan.

HPC is changing the future of research  

For researchers, what makes high performance computing powerful is, in part, its ability to split data into partitions and run many computations at once—accelerating analysis by executing more than one variation of code at a time.

Adam Carlson, a senior HPC systems administrator in IS, compares HPC to checking out at the grocery store on a busy day. “If there is only one line open, the process takes a long time. But once multiple lanes open, carts can be scanned simultaneously rather than one at a time, and you’re done quickly.”
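The checkout-lane analogy can be made concrete with a minimal sketch in Python. This is not DEAC’s actual workflow or scheduler—the function and partitioning scheme below are illustrative assumptions—but it shows the basic pattern: split a dataset into partitions and hand each one to its own worker process, so the “carts” are scanned simultaneously.

```python
# Minimal sketch of the "multiple checkout lanes" idea:
# partition the data, then process the partitions in parallel workers.
from concurrent.futures import ProcessPoolExecutor

def scan_cart(partition):
    # Stand-in for a real analysis step: total up one partition's items.
    return sum(partition)

def parallel_total(items, n_workers=4):
    # Split the data into roughly equal partitions, one per worker ("lane").
    partitions = [items[i::n_workers] for i in range(n_workers)]
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        # Each partition is processed in its own process, simultaneously;
        # the partial results are then combined.
        return sum(pool.map(scan_cart, partitions))

if __name__ == "__main__":
    data = list(range(1_000))
    print(parallel_total(data))  # same answer as sum(data), computed in parallel
```

On a real cluster the same idea plays out at much larger scale: a job scheduler distributes partitions across many nodes rather than across the cores of one machine.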

Carlson and fellow administrators Sean Anderson and Cody Stevens form the HPC team. They upload or create code, troubleshoot and problem-solve for faculty and students from any discipline who want to find fast ways to analyze lots of data.

“For many of us, the HPC Center is an essential component of our research and teaching efforts,” said physics professor Natalie Holzwarth, a founding member of the physics department cluster in 2002. She uses HPC to model the properties of materials that might be candidates for solid-state electrolytes for use in battery technology.

“Having a well-designed and maintained facility makes it possible to focus on scholarship and educating students. Having a knowledgeable and innovative administrative staff clearly raises the productivity of all involved.” — Natalie Holzwarth, Professor of Physics

Economics professor Mark Curtis used the DEAC Cluster for research on how environmental outcomes and environmental policy affect workers. Using a data set of more than 1 billion job postings since 2007, he was able to pull together information on what industry the jobs were in, the company hiring and the skills required to do the work.

What does a low carbon economy mean for U.S. workers?

Mark Curtis researches the balance between green jobs and lost jobs – looking at the implications for U.S. workers in a low carbon economy. Read more about his work and research here.

“The research I have done on the HPC would simply not be possible on any laptop or desktop computer and the team has been great to help me problem-solve whenever my research requires computationally intensive methods,” said Curtis.

Student learning

In addition to supporting campus computing, the HPC team takes its energy and enthusiasm into the classroom, teaching an introductory course on HPC. IS investments in both the HPC technology and the staff who design and manage the systems have enabled the HPC team to embed the technologies and themselves in teaching and learning across a variety of disciplines.

“College may be the only time when you have access to HPC hardware. A student with any interest in computer science or data analytics knows how much opportunity this provides,” said Sean Mealey, a student in the 100-level course who is majoring in finance and computer science. “The cluster allows students to process massive amounts of data and do projects that would otherwise be impossible.”

As a freshman, Nathan Whitener joined a research laboratory in the computer science department working with very high-value data, “upward of sometimes 300,000 cells.” With the DEAC Cluster, Whitener said, the group could use tools and packages without bogging down local machines. Now a junior double-majoring in computer science and mathematical statistics, Whitener has parlayed his HPC experience into opportunities to present at national conferences. He also earned a URECA scholarship, which gives undergraduates the chance to engage in mentored or independent scholarship over the summer.

“High performance computing tools can be leveraged across every field. Scholars are limited only by imagination.” — Sean Anderson, HPC Systems Administrator

As computing power has become centralized, faculty in the humanities and social sciences may be more likely to become HPC users, expanding the use of the DEAC Cluster beyond computer science, business, engineering and the hard sciences.

“Research is flexible and fluid and research design can be flexible and fluid as well,” said Anderson. “It is time-consuming and requires different software for different tasks.”

Learn more about Wake Forest’s DEAC Cluster on the HPC website.

Visit the HPC presentation at Information Systems’ annual technology showcase, TechX, to learn more about how this team, and others, are helping fuel collaboration, support faculty research and facilitate student learning for the Wake Forest community.
