Researchers Deploy GPUs to Build World's Largest Artificial Neural Network

NVIDIA today announced that it has collaborated with a research team at Stanford University to create the world’s largest artificial neural network built to model how the human brain learns. The network is 6.5 times bigger than the previous record-setting network developed by Google in 2012.

Computer-based neural networks are capable of “learning” how to model the behavior of the brain – including recognizing objects, characters, voices, and audio in the same way that humans do.

Yet creating large-scale neural networks is extremely computationally expensive. For example, Google used approximately 1,000 CPU-based servers, or 16,000 CPU cores, to develop its neural network, which taught itself to recognize cats in a series of YouTube videos. The network included 1.7 billion parameters, the virtual representations of connections between neurons.

In contrast, the Stanford team, led by Andrew Ng, director of the university’s Artificial Intelligence Lab, created an equally large network with only three servers, using NVIDIA GPUs to accelerate the processing of the big data generated by the network. With 16 NVIDIA GPU-accelerated servers, the team then created an 11.2 billion-parameter neural network – 6.5 times bigger than the network Google announced in 2012.
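The "parameters" being counted here are the weights on the connections between simulated neurons, and working with them boils down to multiplying very large matrices, which is exactly the kind of dense arithmetic GPUs are built to accelerate. The following is a minimal sketch of that idea in Python with NumPy; the layer sizes are illustrative only and are not the actual Stanford or Google architectures.

```python
import numpy as np

rng = np.random.default_rng(0)

# A dense "layer" connecting two groups of simulated neurons is just a weight
# matrix: every input neuron is connected to every output neuron.
# (Illustrative sizes only; not the Stanford or Google network designs.)
n_inputs, n_outputs = 10_000, 10_000
weights = rng.standard_normal((n_inputs, n_outputs), dtype=np.float32) * 0.01

# Each matrix entry is one "parameter", i.e. one virtual connection.
print(f"{weights.size:,} parameters in this single layer")  # 100,000,000

# A forward pass through the layer is one large matrix multiplication,
# the kind of operation GPU accelerators speed up dramatically.
batch = rng.standard_normal((256, n_inputs), dtype=np.float32)
activations = np.maximum(0.0, batch @ weights)  # ReLU(x W)
```

Even this single toy layer holds 100 million weights; stacking many such layers is how networks reach into the billions of parameters, and why the matrix math has to be spread across accelerators.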

The bigger and more powerful the neural network, the more accurate it is likely to be in tasks such as object recognition, enabling computers to model more human-like behavior. A paper on the Stanford research was published yesterday at the International Conference on Machine Learning.

GPU Accelerators Power Machine Learning

Machine learning, a fast-growing branch of the artificial intelligence (AI) field, is the science of getting computers to act without being explicitly programmed. In the past decade, machine learning has given us self-driving cars, effective web search and a vastly improved understanding of the human genome. Many researchers believe that it is the best way to make progress towards human-level AI.

One of the companies using GPUs in this area is Nuance, a leader in the development of speech recognition and natural language technologies. Nuance trains its neural network models to understand users’ speech using terabytes of audio data. Once trained, the models can recognize spoken words by relating them to the patterns learned during training.
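That two-phase workflow, expensive training on a large labelled dataset up front and then fast recognition by comparing new input against what was learned, is the general shape of such systems. Below is a toy Python sketch of the pattern using made-up acoustic "features"; it is not Nuance's model or data, just an illustration of train-then-recognize.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Training phase (done once, on lots of labelled audio features) ---
# Toy stand-in: two "words", each described by noisy 3-dimensional features.
train_yes = rng.normal(loc=[1.0, 0.2, 0.1], scale=0.1, size=(500, 3))
train_no = rng.normal(loc=[0.1, 1.0, 0.3], scale=0.1, size=(500, 3))

# The "model" learned here is simply the average pattern for each word.
model = {
    "yes": train_yes.mean(axis=0),
    "no": train_no.mean(axis=0),
}

# --- Recognition phase (cheap, runs on every new utterance) ---
def recognize(features):
    # Relate the new input to the patterns learned earlier:
    # pick the word whose learned pattern is closest.
    return min(model, key=lambda word: np.linalg.norm(features - model[word]))

print(recognize(np.array([0.95, 0.25, 0.12])))  # -> "yes"
```

Real speech systems replace the averaged patterns with deep neural networks trained on terabytes of audio, which is where the GPU acceleration described above comes in.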

NVIDIA will be exhibiting at the 2013 International Supercomputing Conference (ISC) in Leipzig, Germany this week, June 16-20, at booth #220.
