Research & Development World
GCS Grants Hundreds of Millions of Computing Core Hours to National Science Projects

By Gauss Centre for Supercomputing | November 12, 2018

Hazel Hen at the High-Performance Computing Center Stuttgart, one of the Gauss Centre for Supercomputing’s three Tier-0 high-performance computing systems, along with JUWELS at the Jülich Supercomputing Centre and SuperMUC at the Leibniz Supercomputing Centre. (Image: Julian Herzog via Wikimedia Commons, https://creativecommons.org/licenses/by/4.0/)

Computing has begun for a new set of leading-edge, large-scale research projects on the Gauss Centre for Supercomputing’s (GCS) three Tier-0 high-performance computing (HPC) systems. GCS leadership approved 816.3 million core hours as part of the organization’s 20th Call for Large-Scale Projects, which funds research projects that require a minimum of 35 million core hours of computing time over a period of twelve months. Thirteen simulation projects met the strict qualification criteria set by the GCS Steering Committee. The grants support national research activities in Computational and Scientific Engineering (351.3 million core hours), Astrophysics (247.5 million core hours), and High Energy Physics (217.5 million core hours).
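To put these figures in perspective, the 35-million-core-hour minimum works out to roughly 4,000 cores running around the clock for a full year. A minimal Python sketch (using only the numbers stated above) checks that the per-field breakdown sums to the announced total and computes that sustained-core figure:

```python
# Per-field allocations from the 20th GCS Call, in millions of core hours.
allocations = {
    "Computational and Scientific Engineering": 351.3,
    "Astrophysics": 247.5,
    "High Energy Physics": 217.5,
}

total = sum(allocations.values())
print(f"Total granted: {total:.1f} million core hours")  # 816.3, matching the announcement

# A large-scale project must consume at least 35 million core hours within
# twelve months; divide by the hours in a year to get the number of cores
# that would have to run continuously to hit that minimum.
hours_per_year = 365 * 24  # 8,760
min_sustained_cores = 35e6 / hours_per_year
print(f"Sustained cores for the 35M-hour minimum: {min_sustained_cores:.0f}")
```

The point of the second figure is scale: even the smallest qualifying project keeps a few thousand cores busy nonstop for a year.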

The newest large-scale projects are divided up between the three GCS HPC systems: Hazel Hen at the High-Performance Computing Center Stuttgart (HLRS), JUWELS at the Jülich Supercomputing Centre (JSC), and SuperMUC at the Leibniz Supercomputing Centre, Garching near Munich (LRZ).

At JSC, users can already make full use of the first module of the new JUWELS supercomputer, which went into operation this past summer. JUWELS will consist of multiple architecturally diverse but fully integrated Tier-0 modules, each designed for specific simulation and data science tasks, which can be combined dynamically and flexibly depending on user requirements. The current installation delivers a peak performance of 12 petaflops. Over the next year and a half, JUWELS will be extended with a Booster module for massively parallel applications, which will significantly increase the system’s performance.

Although the HPC landscape at LRZ is currently in transition, the lion’s share of the computing time granted in the 20th GCS Large-Scale Call—more than 500 million core hours—is delivered by LRZ’s SuperMUC installation. While SuperMUC-NG, LRZ’s “next generation” supercomputer, is currently undergoing its first test runs and will be fully operational in the coming weeks, users can continue to leverage the computing power of the current SuperMUC Phase II installation until the end of 2019. SuperMUC-NG, which is based on the Intel Xeon Scalable Processor and connected by Intel’s Omni-Path network, will be capable of more than 26 petaflops when it comes online.

In the context of the 20th GCS Call, the third GCS centre, HLRS, once more underscored its role as a leading HPC institution supporting research in the field of scientific engineering. HLRS’s Cray XC40 system Hazel Hen is the only GCS HPC system supporting a simulation project with an individual computing time allocation of over 100 million core hours. The project “Analysis of Turbulent Flows and Prediction of Aeroacoustic Sound Fields: Chevron nozzle optimization, active friction drag reduction, and control of shock-induced separation”, run by the Institute of Aerodynamics and Fluid Dynamics at RWTH Aachen University, is supported with a total of 136.1 million core hours on the Stuttgart HPC system.

The application procedure and decision criteria for the GCS Calls for Large-Scale Projects are described in detail on the GCS website.

Copyright © 2026 WTWH Media LLC. All Rights Reserved. The material on this site may not be reproduced, distributed, transmitted, cached or otherwise used, except with the prior written permission of WTWH Media