Google Constructs Novel Chip To Make Machine Learning Faster

It all started as a stealthy project at Google several years ago to gauge what could be accomplished with custom accelerators for machine learning applications. The result is a custom chip dubbed the Tensor Processing Unit, or TPU, an ASIC named after TensorFlow, the software Google uses for its machine learning programs. An ASIC, or application-specific integrated circuit, is designed for a specific workload; here, that workload is deep neural networks: networks of hardware and software that learn specific tasks by analyzing vast amounts of data. This technology has been imperative for the revamp of Google's search engine. The accelerator chip speeds up a specific task, providing better performance per watt than existing chips for machine learning workloads. Owing to this, more operations per second can be squeezed out of the silicon,…
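To make the workload concrete: the heart of a deep neural network is dense matrix arithmetic, and that is what a TPU-style accelerator is built to speed up. The sketch below is purely illustrative (not Google's design or code): it uses NumPy to show a single hypothetical network layer, where one large matrix multiplication dominates the cost. All shapes and values are arbitrary assumptions.

```python
import numpy as np

# Illustrative sketch only: one dense neural-network layer is essentially
# a large matrix multiplication plus a nonlinearity -- the kind of
# multiply-accumulate workload an ML accelerator executes at high
# throughput and low power. Shapes below are arbitrary assumptions.

rng = np.random.default_rng(seed=0)

batch = rng.standard_normal((8, 256))      # 8 input examples, 256 features each
weights = rng.standard_normal((256, 128))  # layer weights (normally learned)
bias = np.zeros(128)

# Forward pass: the matmul is the expensive part; a chip specialized for
# this operation can deliver more such operations per second per watt.
activations = np.maximum(batch @ weights + bias, 0.0)  # ReLU nonlinearity

print(activations.shape)  # (8, 128)
```

Stacking many such layers, and running them over vast datasets, is what makes a general-purpose CPU a bottleneck and a task-specific ASIC attractive.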

Link to Full Article: Google Constructs Novel Chip To Make Machine Learning Faster
