Supercomputing consortium ponders how to size up and publicize coronavirus projects
The COVID-19 High Performance Computing Consortium is developing processes for measuring progress and publicizing results of its research projects, which have only just begun.
Convened by the White House Office of Science and Technology Policy to speed the work of coronavirus researchers, the growing group of members from government, industry and academia discussed information gathering on a board call Monday.
The consortium hasn’t quantified the progress of the 15 research proposals to which it has started supplying more than 402 petaflops of compute power. But plans are in the works to review researchers’ accomplishments weekly.
“If we find something interesting, for example, on the genome analysis or on the repositioning of existing drugs, a measurement of success is that hopefully we … publish or make public, one way or the other, the results of that,” Paul Dabbar, undersecretary for science at the Department of Energy, told FedScoop. “So that the broader scientific and medical communities can see that.”
A member of the consortium, DOE’s National Laboratories system boasts the No. 1 and No. 2 fastest supercomputers in the world: Summit and Sierra, respectively. Summit alone accounts for about 200 petaflops, or 50 percent, of the consortium’s compute power.
Office of Science Associate Director Barbara Helland leads the team developing the next generation of exascale computers at the Argonne, Oak Ridge and Lawrence Livermore national laboratories.
DOE’s Exascale Computing Initiative, which is independent of the consortium, is drawing on the department’s work with the National Cancer Institute and on genomics projects that lend themselves well to COVID-19 research to develop a fast, scalable tool for assembling genomic data fragments.
“These tools can be useful in understanding how the microbiomes of the lungs and the digestive system can be affected by COVID-19 and to also analyze environmental samples,” Helland said.
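Helland did not detail how the assembly tool works, but the general task of stitching overlapping sequencing reads back into longer contiguous sequences can be illustrated with a minimal sketch. The greedy overlap-merging strategy and the sample fragments below are illustrative assumptions, not the DOE tool’s actual algorithm.

```python
# Illustrative only: a toy greedy assembler that merges short sequencing
# reads by their longest suffix-prefix overlaps. Real exascale assemblers
# (including the DOE tool described here) use far more sophisticated methods.

def overlap(a: str, b: str, min_len: int = 3) -> int:
    """Length of the longest suffix of `a` that matches a prefix of `b`."""
    for size in range(min(len(a), len(b)), min_len - 1, -1):
        if a.endswith(b[:size]):
            return size
    return 0

def greedy_assemble(reads: list[str]) -> str:
    """Repeatedly merge the pair of reads with the largest overlap."""
    reads = reads[:]
    while len(reads) > 1:
        best = (0, 0, 1)  # (overlap length, index i, index j)
        for i, a in enumerate(reads):
            for j, b in enumerate(reads):
                if i != j:
                    olen = overlap(a, b)
                    if olen > best[0]:
                        best = (olen, i, j)
        olen, i, j = best
        if olen == 0:  # no overlaps left; just concatenate what remains
            return "".join(reads)
        merged = reads[i] + reads[j][olen:]
        reads = [r for k, r in enumerate(reads) if k not in (i, j)] + [merged]
    return reads[0]

if __name__ == "__main__":
    # Hypothetical fragments of a short genomic region
    fragments = ["ATTAGACCTG", "CCTGCCGGAA", "GCCGGAATAC"]
    print(greedy_assemble(fragments))  # -> ATTAGACCTGCCGGAATAC
```

At exascale, the real challenge is performing this kind of reconstruction across billions of reads in parallel; the toy version above is only meant to show what “assembling genomic data fragments” refers to.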
The consortium research proposals DOE is assisting are not as far along, but they include one from Michigan State University that uses artificial intelligence to reposition existing drugs for COVID-19 applications.
Another standout proposal comes from the NASA Ames Research Center, which is analyzing the genomes of patient groups with a higher risk of infection by the novel coronavirus. High-performance computing can speed the search for a specific genomic sequence or marker that decreases that risk.
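The article doesn’t describe NASA Ames’ analysis pipeline. As a hedged sketch of what searching for such a marker might involve, the toy code below compares how often each variant appears in a higher-risk cohort versus a baseline cohort and flags large frequency differences; the cohorts, variant labels and cutoff are all hypothetical.

```python
# Illustrative only: flag genomic variants whose frequency differs sharply
# between a higher-risk cohort and a baseline cohort. Real analyses rely on
# statistical association tests, not a raw frequency cutoff.
from collections import Counter

def variant_frequencies(cohort: list[set[str]]) -> Counter:
    """Fraction of patients in the cohort carrying each variant."""
    counts = Counter(v for patient in cohort for v in patient)
    return Counter({v: c / len(cohort) for v, c in counts.items()})

def candidate_markers(high_risk, baseline, min_diff=0.3):
    """Variants far more (or less) common in the higher-risk cohort."""
    hi, lo = variant_frequencies(high_risk), variant_frequencies(baseline)
    variants = set(hi) | set(lo)
    return {v: round(hi[v] - lo[v], 2) for v in variants
            if abs(hi[v] - lo[v]) >= min_diff}

if __name__ == "__main__":
    # Hypothetical per-patient variant sets (labels are made up)
    high_risk = [{"rsA", "rsB"}, {"rsA"}, {"rsA", "rsC"}]
    baseline = [{"rsB"}, {"rsC"}, {"rsB", "rsC"}, set()]
    print(candidate_markers(high_risk, baseline))  # -> {'rsA': 1.0}
```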
DOE announced a plan on April 8 to provide up to $30 million in funding for machine learning and AI research in two areas: predictive modeling and simulation in the physical sciences, and decision support for managing complex systems such as those for cybersecurity and power grid resilience.
“Decision support really is computer programs that use large amounts of modeling and data to analyze multiple potential outcomes of a decision,” Helland said. “I think you see a lot of that in the COVID-19 research that we’re doing, where if you make a decision, you want to know what the ramifications of those decisions are.”
An AI algorithm at the center of the program helps researchers understand the uncertainty levels and risks associated with particular decisions.
While the algorithm being developed isn’t specifically for coronavirus research, it could potentially be used to model the spread of COVID-19, project outcomes and help position resources optimally, Helland said.
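Helland’s description of decision support, in which models are run over many potential outcomes of a decision and the uncertainty around each is quantified, can be sketched with a toy example. The simple SIR epidemic model, the two candidate decisions and the parameter ranges below are illustrative assumptions, not DOE’s algorithm.

```python
# Illustrative only: a Monte Carlo "decision support" toy that compares two
# hypothetical decisions by simulating a simple SIR epidemic model under
# uncertain parameters and summarizing the range of outcomes for each.
import random
import statistics

def sir_peak_infected(beta: float, gamma: float, population: int = 100_000,
                      initial_infected: int = 10, days: int = 180) -> float:
    """Peak number of simultaneously infected people in a discrete SIR run."""
    s, i, r = population - initial_infected, float(initial_infected), 0.0
    peak = i
    for _ in range(days):
        new_infections = beta * s * i / population
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        peak = max(peak, i)
    return peak

def evaluate_decision(contact_reduction: float, trials: int = 500) -> dict:
    """Run many simulations with uncertain beta/gamma and summarize outcomes."""
    peaks = []
    for _ in range(trials):
        beta = random.uniform(0.25, 0.45) * (1 - contact_reduction)
        gamma = random.uniform(0.08, 0.12)
        peaks.append(sir_peak_infected(beta, gamma))
    peaks.sort()
    return {
        "mean_peak": statistics.mean(peaks),
        "p95_peak": peaks[int(0.95 * len(peaks))],  # a rough tail-risk measure
    }

if __name__ == "__main__":
    random.seed(0)
    for label, reduction in [("no intervention", 0.0), ("reduce contacts 40%", 0.4)]:
        summary = evaluate_decision(reduction)
        print(f"{label}: mean peak {summary['mean_peak']:.0f}, "
              f"95th-percentile peak {summary['p95_peak']:.0f}")
```

The value of such a tool is that a decision maker sees a range of outcomes for each option, including a tail-risk figure, rather than a single forecast.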