Research & Development World


IBM announces supercomputer to propel sciences forward

By R&D Editors | November 17, 2011

When Blue Gene/Q is fully deployed in 2012 at Lawrence Livermore National Laboratory (LLNL), the system, named “Sequoia,” is expected to achieve 20 petaflops at peak performance, making it one of the fastest supercomputers in the world. Image: IBM

IBM announced that its next-generation supercomputing project, Blue Gene/Q, will provide an ultra-scale technical computing platform to solve the most challenging problems facing engineers and scientists with greater speed, energy efficiency, and reliability than before. Blue Gene/Q is expected to predict the path of hurricanes, analyze the ocean floor to discover oil, simulate nuclear weapons performance, and decode gene sequences.

When it is fully deployed in 2012 at Lawrence Livermore National Laboratory (LLNL), the system, named “Sequoia,” is expected to achieve 20 petaflops at peak performance, making it one of the fastest supercomputers in the world. The capabilities this system represents will help ensure United States leadership in high-performance computing (HPC) and the science it makes possible. Moreover, Blue Gene/Q is expected to become the world’s most power-efficient computer, churning out 2 gigaflops per watt.
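
Taken at face value, the article’s two headline figures imply the machine’s power envelope. The arithmetic below is a back-of-envelope check, not a figure from the article:

```python
# Back-of-envelope power math from the article's figures (the division is ours).
PEAK_FLOPS = 20e15    # 20 petaflops peak performance (Sequoia)
FLOPS_PER_WATT = 2e9  # 2 gigaflops per watt efficiency

power_watts = PEAK_FLOPS / FLOPS_PER_WATT
print(power_watts / 1e6)  # ~10 megawatts, if both figures held at peak
```

In practice sustained power draw would differ from this peak-rate estimate, but it shows the two figures are mutually consistent with a machine in the multi-megawatt class.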

LLNL applies supercomputers to maintaining the nation’s aging nuclear deterrent without
testing, as well as such challenges as grid and network management, energy
research, and climate change. IBM will deploy 96 racks beginning as early as
December of this year.

“It is this emphasis on reliability, scalability, and low
power consumption that draws the interest of NNSA to this machine and its
architecture,” says Bob Meisner, head of NNSA’s Advanced Simulation and
Computing program. “This machine will provide an ideal platform to research and
develop strategies to assure that our most challenging codes run efficiently on
multi-core architectures. Such capabilities will provide information in
formulating our code development strategy as we face the challenges of exascale
simulation and advance the state of the art in simulation science, advances
necessary to ensure our nation’s security without nuclear testing.”

As announced earlier in 2011, Argonne National Laboratory (ANL) will also implement Blue Gene/Q to stoke economic growth and improve U.S. competitiveness for such challenges as designing electric car batteries, understanding climate change, and exploring the evolution of the universe. The 10-petaflop system, named “Mira,” will provide a strong science and technology engine that will fuel national innovation.

“At Argonne, we are already exploiting the power of Mira
through our Early Science program, which provides a broad range of researchers
the opportunity to work with IBM and Argonne technical staff to adapt their
codes to Mira’s unique architecture,” says Mike Papka, Deputy Associate
Laboratory Director, ANL. “This will ensure that Mira will be prepared to run
challenging computational science problems on the first day of operations.”

Delivering breakthrough performance by design

The third generation in the Blue Gene family of supercomputers, Blue Gene/Q operates an order of magnitude faster than previous systems, deploying 16-core multi-processing technology and a peak performance scalable up to 100 petaflops. Applicable to a growing set of computationally intensive workloads within the scientific community, Blue Gene/Q is a suitable platform for highly complex projects in a broad range of areas, from nuclear energy to climate modeling.

Designed with a small footprint and low power requirements, Blue Gene/Q was ranked the most energy-efficient supercomputer in the world by the Green500 list (June 2011). It provides low-latency, high-performance runs that simplify tracing errors and tuning performance, all based on an open-source, standards-based operating environment. Engineered with fewer moving parts and built-in redundancy, Blue Gene/Q has proven to be a class leader in reliability. Blue Gene/Q’s combination of high reliability and energy efficiency makes it an economical supercomputing solution, with fewer failures translating into faster time to sound solutions.

“Completing computationally intensive projects for a wide
variety of scientific applications that were previously unsolvable is not just
possible—it is now probable,” says Brian Connors, vice president of technical
computing at IBM. “IBM’s historic role in developing the supercomputers that
provide the power behind critical applications across every industry has
uniquely positioned us to provide reliable supercomputing at the highest
level.”

The IBM PowerPC A2 processing architecture plays a key role
in delivering performance. Each processor includes 16 compute cores (up from
four used with Blue Gene/P, the previous system) plus a core allocated to
operating system administrative functions and a redundant spare core.
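
These per-chip figures can be reconciled with Sequoia’s 20-petaflop peak. The sketch below assumes Blue Gene/Q’s published 1,024 compute nodes per rack and 8 flops per cycle per core at 1.6 GHz; neither number appears in the article, so treat them as assumptions:

```python
# How 96 racks of 16-core chips reach roughly 20 petaflops.
RACKS = 96                   # from the article
NODES_PER_RACK = 1024        # Blue Gene/Q spec, assumed (not in the article)
CORES_PER_NODE = 16          # compute cores per chip, from the article
FLOPS_PER_CORE = 1.6e9 * 8   # 1.6 GHz x 8 flops/cycle (4-wide FMA), assumed

nodes = RACKS * NODES_PER_RACK     # 98,304 compute nodes
cores = nodes * CORES_PER_NODE     # 1,572,864 compute cores
peak_pflops = cores * FLOPS_PER_CORE / 1e15
print(cores, peak_pflops)          # 1572864 cores, ~20.1 petaflops
```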

Blue Gene/Q incorporates architectural advances that
contribute to the system’s high performance and help simplify
programming. For example, hardware-based speculative execution capabilities
facilitate efficient multi-threading for long code sections, even those with
potential data dependencies. If conflicts are detected, the hardware can
backtrack and redo the work without affecting application performance.

In addition, hardware-based transactional memory helps
programmers avoid the potentially complex integration of locks and helps
eliminate bottlenecks caused by deadlocking—when threads become stuck during
the locking process. Hardware-based transactional memory helps to deliver
efficient and effective multi-threading while reducing the need for complicated
programming.
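
Hardware transactional memory is not exposed in most languages, but its commit-or-retry behavior can be modeled in software. The sketch below is an illustrative analogue, not IBM’s implementation: each transaction computes speculatively without holding the data lock, then commits only if no other thread committed in the meantime, otherwise discarding its work and retrying.

```python
import threading

class VersionedCell:
    """Software analogue of a transactional memory cell (illustrative only)."""

    def __init__(self, value=0):
        self._value = value
        self._version = 0
        self._commit_lock = threading.Lock()  # guards snapshot/commit only

    def transact(self, fn):
        while True:  # retry loop, like a hardware transaction abort
            with self._commit_lock:
                snapshot, version = self._value, self._version
            new_value = fn(snapshot)  # speculative work, no lock held
            with self._commit_lock:
                if self._version == version:  # no conflicting commit occurred
                    self._value = new_value
                    self._version += 1
                    return new_value
            # conflict detected: discard the speculative result and retry

cell = VersionedCell()
threads = [threading.Thread(target=lambda: [cell.transact(lambda v: v + 1)
                                            for _ in range(1000)])
           for _ in range(4)]
for t in threads: t.start()
for t in threads: t.join()
print(cell._value)  # 4000: every increment committed exactly once
```

Note that no thread ever holds a lock while running user code, so the lock-ordering deadlocks the article describes cannot arise; contention shows up as retries instead.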
