Research & Development World

Computer-based weather forecast: New algorithm outperforms mainframe computer systems

By Heather Hall | February 13, 2020

Use of SPA ensures that errors in the temperature forecast are significantly reduced in comparison with those of other procedures. Illustration: © Illia Horenko

The exponential growth in computer processing power seen over the past 60 years may soon come to a halt. Complex systems such as those used in weather forecasting require high computing capacities, but the cost of running supercomputers to process large quantities of data can become a limiting factor. Researchers at Johannes Gutenberg University Mainz (JGU) in Germany and Università della Svizzera italiana (USI) in Lugano, Switzerland, have recently unveiled an algorithm that can solve complex problems with remarkable facility, even on a personal computer.

Exponential growth in IT will reach its limit

In the past, information processing power has accelerated at the constant rate predicted by Moore’s Law, but it now looks as if this exponential growth is reaching its limit. New developments rely on artificial intelligence and machine learning, yet the processes underlying them remain poorly understood. “Many machine learning methods, such as the very popular deep learning, are very successful, but work like a black box, which means that we don’t know exactly what is going on. We wanted to understand how artificial intelligence works and gain a better understanding of the connections involved,” said Professor Susanne Gerber, a specialist in bioinformatics at Mainz University.

Together with Professor Illia Horenko, a computer expert at Università della Svizzera italiana and a Mercator Fellow of Freie Universität Berlin, she has developed a technique for carrying out incredibly complex calculations at low cost and with high reliability. Gerber and Horenko, along with their co-authors, have summarized their concept in an article entitled “Low-cost scalable discretization, prediction, and feature selection for complex systems,” recently published in Science Advances. “This method enables us to carry out tasks on a standard PC that previously would have required a supercomputer,” emphasized Horenko. In addition to weather forecasting, the researchers see numerous possible applications, such as solving classification problems in bioinformatics, image analysis, and medical diagnostics.

Breaking down complex systems into individual components

The paper presented is the result of many years of work on the development of this new approach. According to Gerber and Horenko, the process is based on the Lego principle: complex systems are broken down into discrete states or patterns. With only three or four dozen patterns or components, large volumes of data can be analyzed and their future behavior predicted. “For example, using the SPA algorithm we could make a data-based forecast of surface temperatures in Europe for the day ahead and have a prediction error of only 0.75°C,” said Gerber. It all works on an ordinary PC, with an error rate 40% better than that of the computer systems usually used by weather services, and at much lower cost.
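The Science Advances paper gives the exact mathematical formulation; purely as an illustration of this “Lego principle,” the Python sketch below approximates each data point as a probabilistic mixture (a convex combination) of a small number of learned patterns, alternating a projected-gradient update of the mixture weights with a least-squares update of the patterns. The function names spa_sketch and simplex_project are made up for this sketch, and the optimization scheme is a generic stand-in, not the authors’ published algorithm.

    # Minimal sketch of the idea behind Scalable Probabilistic Approximation (SPA):
    # represent each data point as a probabilistic mixture of K discrete patterns.
    # Illustrative reconstruction only -- not the authors' published code.
    import numpy as np

    def simplex_project(v):
        """Euclidean projection of a vector onto the probability simplex."""
        u = np.sort(v)[::-1]
        css = np.cumsum(u)
        rho = np.nonzero(u * np.arange(1, v.size + 1) > css - 1.0)[0][-1]
        theta = (css[rho] - 1.0) / (rho + 1.0)
        return np.maximum(v - theta, 0.0)

    def spa_sketch(X, K=4, n_iter=100, seed=0):
        """Factor X (T x d) into Gamma (T x K, rows on the simplex) and S (K x d patterns)."""
        rng = np.random.default_rng(seed)
        T, d = X.shape
        S = X[rng.choice(T, size=K, replace=False)].astype(float)  # initialize patterns from data
        Gamma = np.full((T, K), 1.0 / K)
        for _ in range(n_iter):
            # Projected-gradient step on ||X - Gamma S||^2 with respect to Gamma
            grad = (Gamma @ S - X) @ S.T
            step = 1.0 / (np.linalg.norm(S @ S.T, 2) + 1e-12)
            Gamma = np.apply_along_axis(simplex_project, 1, Gamma - step * grad)
            # Least-squares update of the patterns given the current weights
            S = np.linalg.lstsq(Gamma, X, rcond=None)[0]
        return Gamma, S

Each row of Gamma can then be read as the probabilities with which one observation belongs to the K patterns, which is what keeps the decomposition interpretable rather than a black box.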

SPA, or Scalable Probabilistic Approximation, is a mathematically based concept. The method could be useful in various situations that require large volumes of data to be processed automatically, for example in biology, when a large number of cells need to be classified and grouped. “What is particularly useful about the result is that we can then get an understanding of what characteristics were used to sort the cells,” added Gerber. Another potential area of application is neuroscience, where automated analysis of EEG signals could form the basis for assessments of cerebral status. SPA could even be used in breast cancer diagnosis, as mammography images could be analyzed to predict the results of a possible biopsy.
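The published method couples this decomposition with feature selection; as a loose stand-in for that step (not the paper’s actual criterion), one could rank input features by how strongly the learned patterns disagree along them, using the pattern matrix S from the sketch above:

    # Heuristic illustration of feature interpretability: features along which
    # the learned patterns S differ most are the ones that drove the grouping.
    import numpy as np

    def rank_features(S, feature_names):
        spread = S.max(axis=0) - S.min(axis=0)   # per-feature disagreement across patterns
        order = np.argsort(spread)[::-1]
        return [(feature_names[i], float(spread[i])) for i in order]

For cell-classification data, for example, the top-ranked entries would point to the characteristics that separated the groups.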

“The SPA algorithm can be applied in a number of fields, from the Lorenz model to the molecular dynamics of amino acids in water,” concluded Horenko. “The process is easier and cheaper and the results are also better compared to those produced by the current state-of-the-art supercomputers.”
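Since the quote mentions the Lorenz model, a toy demonstration is easy to set up. Assuming the hypothetical spa_sketch function from the sketch above, a Lorenz-63 trajectory can be discretized into a handful of recurring patterns:

    # Toy demonstration on the Lorenz-63 system (uses spa_sketch from above).
    import numpy as np

    def lorenz63(n=5000, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        """Integrate Lorenz-63 with a simple forward-Euler scheme (illustration only)."""
        x = np.empty((n, 3))
        x[0] = (1.0, 1.0, 1.0)
        for t in range(n - 1):
            dx = np.array([sigma * (x[t, 1] - x[t, 0]),
                           x[t, 0] * (rho - x[t, 2]) - x[t, 1],
                           x[t, 0] * x[t, 1] - beta * x[t, 2]])
            x[t + 1] = x[t] + dt * dx
        return x

    Gamma, S = spa_sketch(lorenz63(), K=4)  # four discrete "building block" states
    print(np.round(S, 2))                   # the learned patterns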

The collaboration between the groups in Mainz and Lugano was carried out under the aegis of the newly created Research Center Emergent Algorithmic Intelligence, which was established in April 2019 at JGU and is funded by the Carl Zeiss Foundation.

Comments

  1. Jack Horner says

    February 21, 2020 at 1:22 pm

    Good article about working smarter.

  2. Michael Zahm says

    February 27, 2020 at 2:13 am

    Interesting to see an algorithm that can bypass heavy computing requirements yet provide very comparable results. The obvious advantage is the traceability of the calculation.

  3. Joe Stoner says

    March 30, 2020 at 10:05 am

    Very interesting but: How do we get our hands on the algorithm?

  4. William Tucker says

    July 9, 2020 at 7:13 pm

    To do a good job with Big Data, it needs to be viewed as wind.

    Catching the wind in a sail is how you capture big data effectively... but you have to change the classification of the data to “what is driving the wind” to capture it... the data itself is irrelevant... like the molecules of air... it is identifying and bringing online pertinent data by “naming the driver” and refocusing the examination... so the search should be on being able to name the driver(s) in any instance(s)... not relying on theories but learning how to capture the wind... by learning to examine data as wind...

    see also: analysis of a response to a pandemic on the stock market as the global market place dissolves.

    see also: “Low-cost scalable discretization, prediction, and feature selection for complex systems”

