Google claims its AI supercomputer is faster, greener than NVIDIA's


By MYBRANDBOOK


Alphabet Inc’s Google has released new details about the supercomputers it uses to train its artificial intelligence models, saying the systems are both faster and more power-efficient than comparable systems from Nvidia Corp. Google has designed its own custom chip, called the Tensor Processing Unit (TPU), and uses those chips for more than 90% of the company’s work on artificial intelligence training.


The Google TPU is now in its fourth generation. In a newly published scientific paper, Google detailed how it has strung more than 4,000 of the chips together into a supercomputer, using its own custom-developed optical switches to connect the individual machines.


Improving these connections has become a key point of competition among companies that build AI supercomputers, because the so-called large language models that power technologies like Google’s Bard or OpenAI’s ChatGPT have exploded in size and are far too large to store on a single chip; they must instead be split across many chips working in concert.
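To make that splitting concrete, here is a minimal, hypothetical sketch of how a weight matrix too large for one chip can be partitioned across several accelerators using JAX, the framework Google commonly pairs with TPUs. The array sizes, mesh layout, and variable names are illustrative assumptions and are not taken from Google's paper.

```python
# Illustrative sketch only: shard one large weight matrix across all visible
# accelerator chips with JAX's sharding API, so each device holds just a slice
# of the parameters and the interconnect carries the cross-chip traffic.
import jax
import jax.numpy as jnp
import numpy as np
from jax.sharding import Mesh, NamedSharding, PartitionSpec

# Arrange whatever devices are available (TPU cores, GPUs, or a CPU) into a
# one-dimensional mesh with a single named axis, "model".
mesh = Mesh(np.array(jax.devices()), axis_names=("model",))

# Split the second axis of the weights across the "model" axis; the 8192 x 8192
# size is arbitrary, chosen only to divide evenly across typical device counts.
weights = jax.device_put(
    jnp.zeros((8192, 8192)),
    NamedSharding(mesh, PartitionSpec(None, "model")),
)

# A compiled matrix multiply now runs on every chip in the mesh, with JAX
# inserting the chip-to-chip communication automatically.
activations = jnp.ones((16, 8192))
outputs = jax.jit(lambda x, w: x @ w)(activations, weights)
print(outputs.shape)  # (16, 8192)
```

At the scale described in the paper, the same idea plays out across thousands of chips, which is why the speed of the links between them matters so much.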


Google said its supercomputers make it easy to reconfigure the connections between chips on the fly, which helps the system avoid problems and allows it to be tweaked for performance gains.


“Circuit switching makes it easy to route around failed components,” Google Fellow Norm Jouppi and Google Distinguished Engineer David Patterson wrote in a blog post about the system. “This flexibility even allows us to change the topology of the supercomputer interconnect to accelerate the performance of an ML (machine learning) model.”
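As a rough illustration of what “routing around failed components” means, the toy sketch below (plain Python, not anything from Google's system) treats chips joined through a reconfigurable switch as a graph and finds a new path when one chip is dropped; the chip names, ring layout, and helper function are hypothetical.

```python
# Toy illustration only: chips connected through a reconfigurable switch are
# modelled as a graph, and traffic is re-routed when one chip fails.
from collections import deque

def shortest_path(links, src, dst):
    """Breadth-first search for a path from src to dst over the current links."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in links[path[-1]] - seen:
            seen.add(nxt)
            queue.append(path + [nxt])
    return None

# Four chips wired in a ring through the switch.
links = {
    "chip0": {"chip1", "chip3"},
    "chip1": {"chip0", "chip2"},
    "chip2": {"chip1", "chip3"},
    "chip3": {"chip2", "chip0"},
}
print(shortest_path(links, "chip0", "chip2"))  # ['chip0', 'chip1', 'chip2']

# chip1 fails: the switch drops its links and traffic flows the other way round.
for peer in links.pop("chip1"):
    links[peer].discard("chip1")
print(shortest_path(links, "chip0", "chip2"))  # ['chip0', 'chip3', 'chip2']
```

Changing the interconnect topology, as the quote describes, amounts to the switch rebuilding that connection table with physical optical circuits rather than dictionary entries.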
