World’s Largest Artificial Neural Network

By R&D Editors | June 27, 2013

LEIPZIG, Germany — ISC 2013 — Computer-based neural networks are capable of “learning” how to model the behavior of the brain, including recognizing objects, characters, voices and audio in the same way that humans do.

Yet creating large-scale neural networks is extremely computationally expensive. For example, Google used approximately 1,000 CPU-based servers, or 16,000 CPU cores, to develop its neural network, which taught itself to recognize cats in a series of YouTube videos. The network included 1.7 billion parameters, which represent the connections between neurons.

In contrast, the Stanford team, led by Andrew Ng, director of the university’s Artificial Intelligence Lab, created an equally large network with only three servers using NVIDIA GPUs to accelerate the processing of the big data generated by the network. With 16 NVIDIA GPU-accelerated servers, the team then created an 11.2 billion-parameter neural network — 6.5 times bigger than a network Google announced in 2012.
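As a back-of-the-envelope check on those figures (a hypothetical calculation, not from the Stanford paper), the short sketch below reproduces the size ratio and estimates the raw memory footprint of the larger network. The 4-bytes-per-parameter figure assumes 32-bit floats, a common choice that the article does not actually specify.

```python
# Rough arithmetic on the network sizes quoted above. The 4-byte
# (32-bit float) parameter size is an assumption; the article does
# not say what precision the Stanford team used.
google_params = 1.7e9     # Google's 2012 network
stanford_params = 11.2e9  # Stanford's 16-server network

print(f"size ratio: {stanford_params / google_params:.1f}x")  # ~6.6x, matching the ~6.5x above

bytes_per_param = 4  # assumed float32
total_gb = stanford_params * bytes_per_param / 1e9
print(f"raw parameter storage: {total_gb:.1f} GB")        # 44.8 GB
print(f"per server, 16 servers: {total_gb / 16:.1f} GB")  # 2.8 GB
```

Even under that conservative assumption, the parameters alone run to tens of gigabytes, which is one reason the work is spread across multiple GPU-accelerated servers.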

The bigger and more powerful the neural network, the more accurate it is likely to be at tasks such as object recognition, enabling computers to model more human-like behavior. A paper on the Stanford research was presented at the International Conference on Machine Learning (ICML).

“Delivering significantly higher levels of computational performance than CPUs, GPU accelerators bring large-scale neural network modeling to the masses,” said Sumit Gupta, general manager of the Tesla Accelerated Computing Business Unit at NVIDIA. “Any researcher or company can now use machine learning to solve all kinds of real-life problems with just a few GPU-accelerated servers.”

GPU Accelerators Power Machine Learning

Machine learning, a fast-growing branch of the artificial intelligence (AI) field, is the science of getting computers to act without being explicitly programmed. In the past decade, machine learning has given us self-driving cars, effective web search and a vastly improved understanding of the human genome. Many researchers believe that it is the best way to make progress towards human-level AI.
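As a minimal illustration of “acting without being explicitly programmed” (an invented example, not from the article), the sketch below never hard-codes a decision rule: a tiny model starts from zeroed weights and infers the rule purely from labeled examples by gradient descent.

```python
import numpy as np

# Minimal machine-learning sketch: no classification rule is written
# by hand. The model discovers the hidden rule from labeled examples
# by gradient descent on the logistic loss.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)  # hidden rule the model must learn

w, b, lr = np.zeros(2), 0.0, 0.1
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w + b)))   # predicted probabilities
    w -= lr * (X.T @ (p - y)) / len(y)   # gradient step on the weights
    b -= lr * (p - y).mean()             # gradient step on the bias

acc = (((X @ w + b) > 0) == (y == 1)).mean()
print(f"learned weights {w.round(2)}, accuracy {acc:.0%}")
```

The same train-on-examples principle scales up to the billion-parameter networks described above; only the model size and the volume of data change.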

One of the companies using GPUs in this area is Nuance, a leader in the development of speech recognition and natural language technologies. Nuance trains its neural network models to understand users’ speech using terabytes of audio data. Once trained, the models can recognize patterns in spoken words by relating them to the patterns learned during training.

“GPUs significantly accelerate the training of our neural networks on very large amounts of data, allowing us to rapidly explore novel algorithms and training techniques,” said Vlad Sejnoha, chief technology officer at Nuance. “The resulting models improve accuracy across all of Nuance’s core technologies in healthcare, enterprise and mobile-consumer markets.”
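The speedups Sejnoha describes stem largely from dense matrix multiplication, the operation that dominates neural-network training and that GPUs execute with far more parallelism than CPUs. The sketch below times the same multiply on each device; it uses the PyTorch framework purely as a convenient illustration (PyTorch postdates this 2013 article, and the GPU path assumes a CUDA-capable card).

```python
import time
import torch  # illustrative only; not the software stack used by Nuance or Stanford

def time_matmul(device, n=4096, reps=10):
    # Dense matrix multiplies like this one dominate the cost of
    # training a neural network, which is why GPU acceleration helps.
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    torch.matmul(a, b)  # warm-up pass (triggers any one-time setup)
    if device == "cuda":
        torch.cuda.synchronize()  # wait for queued GPU work before timing
    t0 = time.perf_counter()
    for _ in range(reps):
        torch.matmul(a, b)
    if device == "cuda":
        torch.cuda.synchronize()
    return (time.perf_counter() - t0) / reps

print(f"CPU: {time_matmul('cpu'):.4f} s per multiply")
if torch.cuda.is_available():
    print(f"GPU: {time_matmul('cuda'):.4f} s per multiply")
```

On typical hardware the GPU timing comes out one to two orders of magnitude faster, which is the effect the quotes above describe at data-center scale.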

About NVIDIA

Since 1993, NVIDIA (NASDAQ: NVDA) has pioneered the art and science of visual computing. The company’s technologies are transforming a world of displays into a world of interactive discovery — for everyone from gamers to scientists, and consumers to enterprise customers. More information at http://nvidianews.nvidia.com and http://blogs.nvidia.com.
