Briefing
- Custom Machine Learning Chips – Google published a paper detailing the performance and design of its custom machine learning chips, called Tensor Processing Units (TPUs)
- Faster than GPUs and CPUs – Based on Google's benchmarks, TPUs are 15 to 30 times faster for machine learning applications than contemporary CPUs and GPUs such as the Intel Haswell processor and the Nvidia K80 GPU
- More Energy Efficient – TPUs also deliver 30 to 80 times higher performance per watt than those contemporary products (see the short sketch below)
- TPU Background – The need for TPUs arose in 2011, when Google began integrating more deep learning models into its products; the company first deployed TPUs in its data centers in 2015
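
To make the "performance per watt" comparison above concrete, here is a minimal Python sketch. The throughput and power figures are made-up placeholders, not measurements from Google's paper; the sketch only shows how the speedup and performance-per-watt ratios are computed.

    # Minimal sketch of the two ratios cited in the briefing.
    # All throughput and power figures below are hypothetical placeholders,
    # not measurements from Google's TPU paper.

    def perf_per_watt(ops_per_second: float, watts: float) -> float:
        """Performance per watt: work done per second divided by power draw."""
        return ops_per_second / watts

    # Hypothetical baseline processor vs. hypothetical accelerator.
    cpu_ops, cpu_watts = 1.0e12, 100.0    # 1 TOPS at 100 W (assumed)
    acc_ops, acc_watts = 20.0e12, 50.0    # 20 TOPS at 50 W (assumed)

    speedup = acc_ops / cpu_ops
    efficiency_gain = perf_per_watt(acc_ops, acc_watts) / perf_per_watt(cpu_ops, cpu_watts)

    print(f"speedup: {speedup:.0f}x, performance-per-watt gain: {efficiency_gain:.0f}x")
    # -> speedup: 20x, performance-per-watt gain: 40x
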
Accelerator

Sector: Information Technology

Organization: Google Inc.
Source

- Lardinois, F., "Google says its custom machine learning chips are often 15-30x faster than GPUs and CPUs"
- Jouppi, N., "Quantifying the performance of the TPU, our first machine learning chip"
- Trader, T., "Google pulls back the covers on its first machine learning chip"
- AcceleratingBiz analysis
Original Publication Date: April 5, 2017