NSF grant to help researchers improve cyberinfrastructure applications using machine learning

The researchers want to use machine learning to improve the efficiency of computer programs running on supercomputers, such as the Kamiak high-performance computer.

Scientists and engineers often have to run massive computer programs with vast amounts of data for projects like modeling the Earth's climate and weather, understanding infectious diseases, or sequencing DNA.

They run complex computer programs on powerful supercomputers that can perform thousands of trillions of operations per second. But the programs are often not designed to optimally run on these computers, so programmers end up wasting valuable time and resources.

Jana Doppa and Ananth Kalyanaraman, faculty in the School of Electrical Engineering and Computer Science, recently received a nearly $500,000 National Science Foundation grant to use machine learning to make such state-of-the-art simulations and cyberinfrastructure more efficient and sustainable. The researchers aim to direct supercomputers to automatically find the most efficient ways to run large programs, reducing the burden on resources and on the programmers.

Jana Doppa

“For the first time ever, we will develop and apply machine learning techniques to systematically explore what kind of designs are possible,” Doppa said.

Just as a chess player gets better by learning from previous games, the team’s machine learning algorithms will learn from each previous execution of the program.

“Depending on the feedback, the system will update the program or its execution on the target computing platform,” Doppa said.
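The feedback loop Doppa describes can be pictured with a minimal hill-climbing autotuner. This is an illustrative sketch, not the team's actual method: the tuning knobs (thread count and block size), the analytic cost model standing in for a real program run, and the greedy search strategy are all assumptions chosen to keep the example self-contained.

```python
# Illustrative knobs a tuner might adjust (hypothetical values).
THREADS = [1, 2, 4, 8, 16, 32]
BLOCKS = [16, 32, 64, 128, 256]

def run_program(config):
    """Stand-in for launching the real program and timing it.
    A simple analytic cost model keeps the sketch runnable:
    too few threads underuses the machine, and a block size
    far from 64 adds overhead."""
    threads, block = config
    return 100.0 / threads + abs(block - 64) * 0.5

def neighbors(config):
    """Configurations one step away in either knob."""
    ti, bi = THREADS.index(config[0]), BLOCKS.index(config[1])
    out = []
    for dt in (-1, 0, 1):
        for db in (-1, 0, 1):
            if (dt, db) == (0, 0):
                continue
            nt, nb = ti + dt, bi + db
            if 0 <= nt < len(THREADS) and 0 <= nb < len(BLOCKS):
                out.append((THREADS[nt], BLOCKS[nb]))
    return out

def autotune(start=(1, 16)):
    """Run, observe the measured time, and move to whichever
    neighboring configuration ran fastest; stop when no
    neighbor improves on the current best."""
    config = start
    best_time = run_program(config)
    while True:
        candidate = min(neighbors(config), key=run_program)
        candidate_time = run_program(candidate)
        if candidate_time >= best_time:
            return config, best_time
        config, best_time = candidate, candidate_time
```

Starting from a deliberately poor configuration, each iteration uses the measured runtime as feedback, mirroring how each execution informs the next. The team's actual approach uses machine learning models rather than this simple greedy search.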

The research will help programmers more easily create and run effective computer programs to tackle cutting-edge problems in science and engineering, making development and deployment of those applications more sustainable in the process.

Ananth Kalyanaraman

“Our approach will automatically learn how to make design decisions to optimize the criteria, like performance or precision, that a programmer chooses,” Kalyanaraman said.

“It will combine algorithm abstractions, programming tools, and machine learning techniques to make cyberinfrastructure applications adapt easily to the problem at hand, instead of being rigid,” Kalyanaraman added.

As part of the grant, the team will work with researchers from Pacific Northwest National Laboratory to broaden the research's impact, and will develop teaching modules for adoption into undergraduate and graduate courses in parallel algorithms and machine learning.
