Google has begun building its own custom application-specific integrated circuit (ASIC) chips, called tensor processing units (TPUs), Google chief executive Sundar Pichai said today at the Google I/O developer conference in Mountain View, California.
The name is inspired by Google's TensorFlow open source deep learning framework.
"When you use the Google Cloud Platform, you can take advantage of TPUs as well," Pichai said.

Specialty hardware — sort of taking a cue from the holographic processing unit (HPU) inside Microsoft's HoloLens augmented reality headset — will not be the only thing that makes the Google public cloud stand out from market leader Amazon Web Services (AWS). Over time, Google will also expose more and more machine learning APIs, Pichai said.
"Our goal is to lead the industry on machine learning and make that innovation available to our customers," Google distinguished hardware engineer Norm Jouppi wrote in a blog post.
"Building TPUs into our infrastructure stack will allow us to bring the power of Google to developers across software like TensorFlow and Cloud Machine Learning with advanced acceleration capabilities. Machine learning is transforming how developers build intelligent applications that benefit customers and consumers, and we're excited to see the possibilities come to life."