Google announced at its I/O developer conference that it has designed its own chip for deep neural networks. Sundar Pichai spoke about a custom-built chip that helps give Google its edge in machine learning and artificial intelligence. The chip, dubbed a TPU or Tensor Processing Unit (in keeping with Google's A.I. platform TensorFlow), is purpose-built for running Google's decision-making algorithms.
Most companies, including Facebook and Microsoft, rely on GPUs for their machine learning and artificial intelligence workloads.
CEO Sundar Pichai said that Google has built an application-specific integrated circuit (ASIC) to drive deep neural nets, an artificial intelligence (AI) technology that is reshaping the Internet. These are networks of software and hardware that analyze vast amounts of data in order to learn specific tasks.
Google uses neural nets to recognize voice commands on Android phones, identify faces and objects in photos, and translate text from one language to another. Even Google's search engine is being transformed by this new technology.
Google calls the chip the Tensor Processing Unit (TPU) because it underpins TensorFlow, the software engine that drives the company's deep learning services. Google released TensorFlow under an open-source license this past fall, so any developer outside the company can use and modify the software.
Google said in a blog post that the TPU is tailored to machine learning applications, requiring fewer transistors per operation and tolerating reduced computational precision. Google has not shared the TPU's designs, but outsiders can use the company's machine learning software and hardware via various cloud services.
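To give a sense of what "reduced computational precision" means in practice, here is a minimal, purely illustrative sketch of weight quantization: mapping 32-bit floating-point values onto a small 8-bit integer grid. This is a generic technique, not Google's actual TPU design (which has not been published); the function names and values below are hypothetical.

```python
def quantize(weights, num_bits=8):
    """Map floats onto a signed integer grid; return ints plus a scale factor."""
    max_abs = max(abs(w) for w in weights)
    levels = 2 ** (num_bits - 1) - 1          # e.g. 127 for 8 bits
    scale = max_abs / levels if max_abs else 1.0
    return [round(w / scale) for w in weights], scale

def dequantize(q_weights, scale):
    """Recover approximate floats from the integer grid."""
    return [q * scale for q in q_weights]

# Hypothetical neural-net weights
weights = [0.91, -0.42, 0.07, -1.30]
q, scale = quantize(weights)
approx = dequantize(q, scale)
```

Storing and multiplying 8-bit integers instead of 32-bit floats takes far fewer transistors per operation, at the cost of a small rounding error that neural nets typically tolerate well.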
The new TPU is a major leap forward for intelligent applications. By creating its own custom chip, Google has significantly improved the speed of its machine learning systems.
Other companies, including Twitter, Microsoft, and Facebook, have also incorporated deep learning into a wide range of Internet services. Their neural nets are typically driven by graphics processing units (GPUs) made by companies like Nvidia, though some are also exploring field-programmable gate arrays (FPGAs), which can be programmed for specific tasks.