Building Intelligence into Machine Learning Hardware

December 5, 2016 | Ben Cotton

Machine learning is a rising star in the compute constellation, and for good reason. It can not only make life more convenient – think email spam filtering, shopping recommendations, and the like – but also save lives by powering the intelligence behind autonomous vehicles, heart attack prediction, and more. While the applications of machine learning are bounded only by imagination, the execution of those applications is bounded by the available compute resources. Machine learning is compute-intensive, and it turns out that traditional compute hardware is not well suited to the task. Many machine learning shops have approached the problem with graphics processing units (GPUs), application-specific integrated circuits (ASICs) – for example, Google's Tensor Processing Unit, built to accelerate TensorFlow – or field-programmable gate arrays (FPGAs) – for example, Microsoft's…
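To get a feel for why general-purpose hardware becomes the bottleneck, consider that most of the compute in training and running neural networks reduces to large dense matrix multiplications – exactly the operation that GPUs, ASICs, and FPGAs are built to accelerate. The short sketch below (not from the article; the matrix size is an arbitrary stand-in for a network layer) times one such product on a CPU with NumPy:

```python
import time
import numpy as np

# Deep learning workloads are dominated by dense matrix multiplications
# (the core of fully connected and convolutional layers). Timing one such
# product on a general-purpose CPU hints at why specialized hardware pays off.

n = 4096  # hypothetical layer size; a training step performs many such products
a = np.random.rand(n, n).astype(np.float32)
b = np.random.rand(n, n).astype(np.float32)

start = time.perf_counter()
c = a @ b  # roughly 2 * n**3 floating-point operations
elapsed = time.perf_counter() - start

gflops = 2 * n**3 / elapsed / 1e9
print(f"{n}x{n} matmul: {elapsed:.3f} s, ~{gflops:.1f} GFLOP/s on this CPU")
```

Accelerators devote their silicon almost entirely to this kind of arithmetic, which is why they can deliver orders of magnitude more throughput per watt than a CPU on the same workload.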


Link to Full Article: Building Intelligence into Machine Learning Hardware
