In 2013, the term big data continued to dominate the conversation around technology challenges, experimentation and innovation. It’s no surprise, then, that many business and IT executives are suffering from big data exhaustion, and that Gartner deemed 2013 the year the technology entered the “Trough of Disillusionment.”
However, 2013 also saw early signs of the reentry of high performance computing (HPC) into the enterprise. The technology has a long legacy in the scientific and research community, but big data served as a catalyst for HPC adoption within traditional enterprises. While the term big data may have dipped into disillusionment, HPC holds the promise of steering big data toward business growth and productivity.
At SGI, our customers have quickly realized the benefits of pairing HPC with big data. One prime example is PayPal. A leader in online payments, PayPal processes 13 million financial transactions a day, and it deployed SGI’s UV platform and HPC technology to ensure fraud was caught early. In the first year, SGI’s analysis of PayPal’s big data identified $710 million in fraud that would otherwise have gone undetected, by spotting suspicious patterns across seemingly unrelated transactions. As the company grew, real-time analytics alone was no longer enough; predictive analytics has since become a key competitive advantage for companies such as PayPal. Big data analytics as a whole has grown in importance by providing unprecedented, intuitive insights into an array of business functions such as sales, supply chain and network operations.
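To make that kind of pattern detection concrete, here is a minimal, hypothetical sketch, not PayPal’s or SGI’s actual system: it flags recipients who receive funds from several distinct senders within a short window, one simple way suspicious patterns can emerge across seemingly unrelated transactions. All account names, thresholds and amounts are invented for illustration.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical transaction records: (timestamp, sender, recipient, amount)
transactions = [
    (datetime(2013, 6, 1, 12, 0), "acct_17", "acct_99", 45.0),
    (datetime(2013, 6, 1, 12, 3), "acct_52", "acct_99", 48.5),
    (datetime(2013, 6, 1, 12, 7), "acct_88", "acct_99", 44.0),
    (datetime(2013, 6, 1, 15, 0), "acct_17", "acct_23", 300.0),
]

WINDOW = timedelta(minutes=15)   # look for bursts within a 15-minute window
MIN_SENDERS = 3                  # distinct senders needed to raise a flag

def flag_convergent_bursts(txns):
    """Flag recipients paid by many distinct senders within a short window."""
    by_recipient = defaultdict(list)
    for ts, sender, recipient, amount in sorted(txns):
        by_recipient[recipient].append((ts, sender))
    flags = []
    for recipient, events in by_recipient.items():
        for i, (start, _) in enumerate(events):
            senders = {s for ts, s in events[i:] if ts - start <= WINDOW}
            if len(senders) >= MIN_SENDERS:
                flags.append((recipient, start, sorted(senders)))
                break  # one flag per recipient is enough for this sketch
    return flags

print(flag_convergent_bursts(transactions))
# [('acct_99', datetime.datetime(2013, 6, 1, 12, 0),
#   ['acct_17', 'acct_52', 'acct_88'])]
```

At production scale, the same idea runs over billions of records rather than four, which is where HPC-class memory and throughput come into play.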
In 2014, key HPC capabilities will enhance the level of intelligence and speed big data analytics can provide for the enterprise, including:
- Graphing and mapping: HPC-powered data mapping and graphing will lead to greater accuracy in business forecasting
- Pattern visualizations: HPC-powered tools will emerge that provide an intuitive view of complex data sets, enabling rapid identification of relationships and simpler analysis
- Scaling in-memory databases: HPC will allow enterprise in-memory systems to handle larger data workloads, letting near-complete data sets, rather than partial ones, benefit from real-time analytics while the data is in motion (a toy sketch of the full-versus-sampled trade-off follows this list)
- Meta-data: The importance of meta-data will jump dramatically, as enterprises realize that leveraging meta-data analytics for visualization and relational mapping can yield enhanced accuracy and new business insights, and even reveal security threats
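To ground the in-memory point above, here is a toy sketch, in plain NumPy rather than any SGI product, of why analyzing the complete data set in memory beats working from a partial sample: a tail statistic computed over all records is exact, while the same statistic from a sample is only an estimate. All numbers here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical transaction amounts. A real workload would be billions of
# rows, which is where large shared-memory (HPC-class) systems earn their keep.
amounts = rng.lognormal(mean=3.0, sigma=1.2, size=10_000_000)

# Statistic over the complete data set: exact, since nothing was sampled away.
full_p999 = np.quantile(amounts, 0.999)

# The same statistic from a 1% sample: cheaper, but only an estimate.
sample = rng.choice(amounts, size=len(amounts) // 100, replace=False)
sample_p999 = np.quantile(sample, 0.999)

print(f"99.9th percentile, full data set: {full_p999:.2f}")
print(f"99.9th percentile, 1% sample:     {sample_p999:.2f}")
```

On tail-heavy data such as fraud signals, the sampled estimate can drift noticeably from the true value, which is the gap that larger in-memory systems are positioned to close.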
Overall, 2014 will be the year when HPC elevates its status within the enterprise, becoming a must-have business technology for extracting the greatest value from the largest amounts of data. Arguably, the biggest issue today is the failure to recognize the true scope of available relevant data. Big data analytics is most effective when it combines internal structured and unstructured data with externally available data from sources such as social, market, web and sensor data. By pairing HPC with big data, companies will maximize the intelligence from all of these sources, processing data at high volume with the speed and accuracy enterprises need to continue to thrive.
Jorge Titinger is CEO, SGI. He may be reached at editor@ScientificComputing.com.