For decades, IBM has collaborated with the U.S. government to deploy high performance computers in the national laboratories and government agencies that help the country retain its leadership in science and commerce and safeguard its national security.
That’s why we were so pleased when President Obama issued an executive order establishing the National Strategic Computing Initiative (NSCI), with the goal of ensuring that the United States leads in the field of high performance computing. The initiative is aimed at producing computers capable of exascale performance: a billion billion (10^18) operations per second, roughly 30 times faster than today’s most powerful supercomputers. What’s clear is that the Administration views advances in high performance computing as essential for the United States to remain globally competitive.
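To put that number in perspective, here is a minimal back-of-envelope comparison, using the roughly 33.9-petaflops Linpack score of Tianhe-2, the top system on the June 2015 TOP500 list:

$$\frac{10^{18}\ \text{operations/s}}{3.39 \times 10^{16}\ \text{operations/s}} \approx 30$$

In other words, an exascale machine would do in one second what today’s leader needs about half a minute to do.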
While the high-level goals of the NSCI are clear, we believe that the next step should be for the government to collaborate with leaders of industry and academia to identify a set of high-priority grand challenges that government, universities and businesses can take on together.
Traditionally, the focus within the high performance computing community has been on optimizing systems for demanding scientific problems, with the stress on modeling and simulation. But with the emergence of big data, researchers in domains as diverse as healthcare, genomics, financial analytics, and social behavior also need to analyze and visualize large, complex data sets. They need systems that help them manage and analyze data to produce deeper insights. The high performance computing systems of the future must be able to handle both kinds of computing challenges.
So it’s essential for the government to engage business leaders from a variety of critical industries to develop projects that address those industries’ challenges and opportunities, even as it pushes forward with new computing architectures.
I can’t speak for the industry leaders, but I can suggest a few pursuits that would be worthy of government-industry collaboration:
- Personalized medicine: The emergence of inexpensive gene sequencing systems makes it practical for the first time for physicians to understand how a disease (cancer, for instance) is connected to an individual’s DNA. As a result, it’s becoming possible to design therapies that are custom-tailored to that individual. But this work is data-intensive, as the sizing sketch after this list suggests, and it requires a tremendous amount of computing resources, plus new ways of managing and analyzing data.
- Energy production: New extraction technologies make it easier to get oil and gas out of the ground. We can use high performance computing to analyze energy deposits and to help engineers determine the best ways to tap them — being mindful not just of economics, but of health and environmental concerns. We can also use powerful computers to optimize the way different forms of energy, including alternative energy sources, are put to work in the economy.
- Public safety: Police and security organizations struggle to anticipate seemingly random attacks by “lone wolf” gunmen. But what if they could detect patterns of behavior that signal that an individual might be considering or planning such an attack? By monitoring social networks and matching potentially telling signals with police records about individuals in real time, it might become possible to predict who is planning an attack, and even where and when the attack might come.
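To make the data-intensity point in the personalized medicine item concrete, here is a minimal sizing sketch in Python. The constants are illustrative assumptions (about 100 GB of raw reads per genome at 30x coverage is a commonly cited ballpark), not IBM figures:

```python
# Back-of-envelope sizing for a population-scale genomics program.
# All constants are illustrative assumptions, not measured values.

RAW_BYTES_PER_GENOME = 100e9   # ~100 GB of raw reads at ~30x coverage (ballpark)
PATIENTS = 1_000_000           # a hypothetical national cohort

total_bytes = RAW_BYTES_PER_GENOME * PATIENTS
print(f"Raw sequencing data: {total_bytes / 1e15:.0f} PB")  # -> 100 PB
```

Storing and repeatedly scanning data at that scale is exactly the regime where simply buying faster processors stops being an answer.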
These are applications where powerful computers can make a huge difference, but it’s not enough to take the computer architectures that already exist and simply speed them up or add more computing resources. You can’t brute-force these problems. Instead, it’s necessary to develop a bold new approach, which we at IBM call data-centric computing. This approach addresses both the modeling and simulation applications that are the traditional focus of the high performance computing community and the new applications in big data analytics and cognitive computing.
In data-centric computers, much of the processing will move to where the data resides, whether that’s within a single computer, in a network or out on the cloud. Microprocessors will still be vitally important, but their work will be divided up among a variety of specialized chips.
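As a loose illustration of that idea, here is a minimal sketch in Python rather than IBM’s actual design; the shard list stands in for storage nodes, and both functions are invented for the example. Instead of pulling every byte to a central processor, each node reduces its own data and ships back only a small summary:

```python
# Toy illustration of data-centric computing: rather than moving raw data
# to a central processor, move the computation to where the data lives.
# The "shards" below stand in for storage nodes; everything is invented
# for illustration.

shards = [
    [3.1, 4.1, 5.9],   # data resident on node 0
    [2.6, 5.3, 5.8],   # data resident on node 1
    [9.7, 9.3, 2.3],   # data resident on node 2
]

# Data-shipping approach: pull every value to one place, then compute.
def centralized_mean(shards):
    pulled = [x for shard in shards for x in shard]  # all bytes cross the network
    return sum(pulled) / len(pulled)

# Compute-shipping approach: each node reduces its own data locally and
# returns only a tiny (sum, count) summary; the coordinator merges them.
def local_summary(shard):
    return sum(shard), len(shard)   # runs where the data resides

def data_centric_mean(shards):
    summaries = [local_summary(s) for s in shards]  # imagine these as RPCs
    total = sum(s for s, _ in summaries)
    count = sum(c for _, c in summaries)
    return total / count

assert abs(centralized_mean(shards) - data_centric_mean(shards)) < 1e-9
print(data_centric_mean(shards))
```

The two functions return the same answer; the difference is that the data-centric version moves a handful of numbers across the network instead of the whole data set, which is the essence of processing data where it resides.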
This is the beginning of one of the most significant shifts in the history of computing, and we at IBM look forward to collaborating with our colleagues in government, industry and academia to turn the concept of data-centric computing into a full-blown reality. The National Strategic Computing Initiative is the perfect vehicle for developing the new architecture and for taking on the grand computing challenges that will move business and society forward.
To learn more about the new era of computing, read Smart Machines: IBM’s Watson and the Era of Cognitive Computing.
David Turek is Vice President, High Performance Computing at IBM Corporation. He may be reached at editor@ScientificComputing.com.