NVIDIA Tesla GPUs will power Facebook’s next-generation deep learning machine

NVIDIA's new Tesla GPUs will be used to power Big Sur, Facebook's next-generation, open source deep learning computing system.

The NVIDIA Tesla M40 will power Facebook's Big Sur machine learning system. (Image Source: NVIDIA)

NVIDIA Tesla GPUs are about to help Facebook build a more intelligent next-generation computing system. The GPU manufacturer today announced that Facebook would be using its Tesla Accelerated Computing Platform to power its next-generation computing system, codenamed “Big Sur”, and drive a wide range of machine learning applications.

Because of their massively parallel architectures, GPU-accelerated computing platforms can significantly reduce the time needed to complete tasks like training complex deep neural networks. According to NVIDIA, such training runs 10 to 20 times faster on its Tesla platform, which means developers can now train more powerful and sophisticated networks that will deliver better capabilities to consumers.
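To make that claim concrete, here is a minimal sketch of what GPU-accelerated training looks like in code. The framework (PyTorch), the toy network, and the batch sizes are our own illustrative assumptions rather than details from NVIDIA's or Facebook's announcements; the point is simply that the same training step can be moved onto a CUDA GPU, where its dense matrix operations execute in parallel.

```python
# Minimal sketch: the same training step on CPU vs. a CUDA GPU.
# Framework (PyTorch) and network/batch sizes are illustrative assumptions,
# not details from NVIDIA's or Facebook's announcements.
import time
import torch
import torch.nn as nn

def train_step_benchmark(device, steps=20):
    # A small fully connected network; real vision/speech models are far larger.
    model = nn.Sequential(
        nn.Linear(4096, 4096), nn.ReLU(),
        nn.Linear(4096, 1000),
    ).to(device)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()
    x = torch.randn(256, 4096, device=device)
    y = torch.randint(0, 1000, (256,), device=device)

    start = time.time()
    for _ in range(steps):
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()      # gradients computed as parallel matrix operations
        optimizer.step()
    if device.type == "cuda":
        torch.cuda.synchronize()  # wait for queued GPU work before stopping the clock
    return time.time() - start

cpu_time = train_step_benchmark(torch.device("cpu"))
if torch.cuda.is_available():
    gpu_time = train_step_benchmark(torch.device("cuda"))
    print(f"CPU: {cpu_time:.1f}s  GPU: {gpu_time:.1f}s  speed-up: {cpu_time / gpu_time:.1f}x")
```

On hardware like the Tesla M40, the GPU path completes this kind of dense-matrix workload many times faster than the CPU path, which is where the quoted training speed-ups come from.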

Facebook will use NVIDIA Tesla M40 GPU accelerators, introduced just last month, to train its deep neural networks, making it the first company to use the new GPUs for this purpose.

Big Sur is Facebook’s newest Open Rack-compatible hardware designed for large-scale artificial intelligence computing, part of Facebook AI Research’s (FAIR) effort to advance its understanding and capabilities in the field of machine intelligence. It was built specifically with the NVIDIA Tesla M40 in mind and incorporates eight GPUs of up to 300 watts each in a single server. Thanks to the new Tesla GPUs, Big Sur is twice as fast as its predecessor, enabling Facebook to train neural networks in half the time and explore networks twice as large.
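An eight-GPU server like Big Sur is typically used for data-parallel training, where each GPU processes a slice of the batch and the results are combined. The sketch below illustrates that pattern with PyTorch's DataParallel wrapper; it is our own example under those assumptions, not the software stack Facebook actually runs on Big Sur.

```python
# Illustrative sketch of data-parallel training across up to eight GPUs,
# the configuration Big Sur provides. PyTorch's DataParallel is used here
# purely as an example of the pattern.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(1024, 1024), nn.ReLU(), nn.Linear(1024, 10))
batch = torch.randn(2048, 1024)

if torch.cuda.is_available():
    n_gpus = min(torch.cuda.device_count(), 8)  # Big Sur offers eight GPUs
    # Replicate the model on each GPU; every replica handles a slice of the
    # batch, outputs are gathered on the primary GPU, and gradients are
    # accumulated there during backward().
    model = nn.DataParallel(model.cuda(), device_ids=list(range(n_gpus)))
    batch = batch.cuda()

logits = model(batch)  # forward pass is split across the available GPUs
print(logits.shape)
```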

The Open Rack-compatible 8-GPU server. (Image Source: Facebook)

This opens the door to more accurate models and new classes of advanced applications. Facebook already uses neural networks in some of its products and services, for instance to automatically identify and tag pictures of you and your friends.

But in addition to providing more powerful capabilities for machine learning, GPUs exhibit architectural compatibility across generations, which allows for easier and more seamless upgrades in the future.

Facebook will also work with its partners to open source the Big Sur specifications via the Open Compute Project, enabling AI researchers across the globe to collaborate on and improve the design. The move was perhaps motivated by Google’s open sourcing of TensorFlow, but it also stands to benefit Facebook directly: if more companies adopt its designs, costs fall through economies of scale, and resources and knowledge are shared more widely.

Source: NVIDIA
