[News] Google benchmarks its Tensor Processing Unit (TPU) chips
http://hexus.net/tech/news/industry/...nit-tpu-chips/
Quote:
As Google relies heavily on compute-intensive machine learning for its core activities, it has designed and rolled out its own Tensor Processing Unit (TPU) accelerator chips in recent years. The need for such a chip became evident about six years ago, as technologies such as Google voice search drove up the computational cost of the company's deep-neural-net speech recognition systems.
Quote:
- On our production AI workloads that utilize neural network inference, the TPU is 15x to 30x faster than contemporary GPUs and CPUs.
- The TPU also achieves much better energy efficiency than conventional chips, achieving 30x to 80x improvement in the TOPS/Watt measure (tera-operations [trillion or 10^12 operations] of computation per Watt of energy consumed).
- The neural networks powering these applications require a surprisingly small amount of code: just 100 to 1500 lines. The code is based on TensorFlow, our popular open-source machine learning framework.
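For clarity, the TOPS/Watt efficiency measure quoted above is just operations per second, scaled to tera-operations (10^12), divided by power draw. A minimal sketch of that arithmetic, using made-up example numbers rather than Google's published figures:

```python
# Illustration of the TOPS/Watt metric cited in the quote.
# The throughput and power numbers here are hypothetical examples,
# not measurements from the TPU paper.

def tops_per_watt(ops_per_second: float, power_watts: float) -> float:
    """Tera-operations (10**12 ops) per second, per Watt consumed."""
    tera_ops_per_second = ops_per_second / 1e12
    return tera_ops_per_second / power_watts

# Example: an accelerator sustaining 92e12 ops/s while drawing 40 W
print(tops_per_watt(92e12, 40))  # 2.3 TOPS/Watt
```

A 30x to 80x improvement in this measure means the chip delivers that many more trillion operations for each Watt consumed than the CPUs and GPUs it was compared against.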
http://hexus.net/media/uploaded/2017...c1cb750ced.jpg
http://hexus.net/media/uploaded/2017...158239799e.png
http://hexus.net/media/uploaded/2017...71772c381a.png
http://hexus.net/media/uploaded/2017...666741d66a.png
http://hexus.net/media/uploaded/2017...83c2933743.png