HPC resources increasingly crucial for extending capabilities from search to discovery
Anyone remember the old Shake ’n Bake commercials where the parents exclaim over the breaded chicken at dinner and the child chimes in, “And I helped!”? In 2012, record-setting HPC server revenue was the tasty chicken and the little helper was Big Data.
HPC server revenue grew 7.7 percent over 2011, beating IDC’s 7.1 percent forecast to hit an all-time high of $11.1 billion. The supercomputer segment (systems priced at $500,000 and up) continued as the heavy lifter: supercomputer sales jumped 29.3 percent over the prior year to $5.6 billion worldwide, inching past 50 percent of all HPC server revenue. But 2012 was an unusually strong growth year for supercomputers, and that pace will slow.
The sub-$500,000 price bands fared less well. Revenue in the divisional segment ($250,000 to $499,999) dipped 2.2 percent to finish 2012 at $1.2 billion. After posting a record $3.5 billion year in 2011, the departmental segment ($100,000 to $249,999) eased back to $3.0 billion in 2012, still topping the 2009 mid-recession low point of $2.8 billion. Workgroup systems (sub-$100,000) rebounded modestly in 2012 to $1.24 billion, a 1.2 percent gain over the 2011 total but still far from the 2008 figure of $2.5 billion.
IDC forecasts that the overall HPC technical server market will experience a healthy 7.3 percent compound annual growth rate (CAGR) over the 2011 to 2016 forecast period, with revenues exceeding $14 billion by 2016.
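As a quick sanity check (my arithmetic, not IDC’s model), the article’s own numbers imply a 2011 base of roughly $10.3 billion, and compounding that at 7.3 percent for five years lands just above the $14 billion mark:

```python
# Back-of-envelope check of the forecast, using only figures cited in
# this article. Assumption: the 2011 base (~$10.3B) is derived from the
# reported 2012 total ($11.1B) and its 7.7% year-over-year growth.

def project(base, cagr, years):
    """Compound a base revenue figure at a fixed annual growth rate."""
    return base * (1 + cagr) ** years

base_2011 = 11.1 / 1.077              # ~$10.3 billion
revenue_2016 = project(base_2011, 0.073, 5)
print(f"Implied 2016 revenue: ${revenue_2016:.1f} billion")  # ~$14.7 billion
```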
The Little Helper: HPDA
What role does high-performance data analysis (Big Data needing HPC) play in this global market? High-performance data analysis is the term IDC coined to describe the convergence of the established, data-intensive HPC market based on modeling and simulation with the high-end analytics market, which includes commercial firms moving up to HPC resources for the first time.
A good commercial example is PayPal, a multi-billion-dollar eBay company, which not long ago integrated HPC servers and storage into its datacenter workflow to perform sophisticated fraud detection on eBay and Skype transactions in real time. Real-time detection can catch fraud before it hits credit cards.
Another commercial adopter is GEICO, which is using HPC to perform weekly updates of insurance quotes for every eligible U.S. household and individual.
The common denominator underlying simulation- and analytics-based HPDA workloads is a degree of algorithmic complexity that is atypical for transaction-processing-based business computing. With the help of sophisticated algorithms, HPC resources are already enabling established HPC users, as well as commercial adopters, to move beyond “needle in a haystack” searches to discover high-value, dynamic patterns. HPC resources will be increasingly crucial for extending “Big Data” capabilities from search to discovery.
IDC forecasts that revenue for HPC servers acquired primarily for HPDA use will grow robustly. HPDA-centric servers contributed $673 million to HPC revenue in 2011 and will contribute about $1.2 billion in 2015. Revenue for the whole HPDA ecosystem, including servers, storage and interconnects, software and services, should be roughly double the server figure alone.
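Those two endpoints imply a growth rate well above the overall market’s. A back-of-envelope calculation (mine, not an IDC-published rate) puts the implied CAGR at roughly 15 to 16 percent:

```python
# Growth rate implied by the article's HPDA server figures:
# $673M in 2011 to ~$1.2B in 2015. A rough check derived from the
# cited endpoints, not an IDC-published rate.

def implied_cagr(start, end, years):
    """Solve (1 + r) ** years == end / start for the annual rate r."""
    return (end / start) ** (1 / years) - 1

rate = implied_cagr(0.673, 1.2, 4)    # revenue in $ billions, 2011 -> 2015
print(f"Implied HPDA server CAGR: {rate:.1%}")  # ~15.6%
```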
Steve Conway is Research VP, HPC at IDC. He may be reached at editor@ScientificComputing.com.