ASICs

This week, Google announced Cloud TPU beta availability on the Google Cloud Platform (GCP), accessible through its Compute Engine infrastructure-as-a-service. Built around the second generation of Google’s tensor processing units (TPUs), the standard Cloud TPU configuration consists of four custom ASICs and 64 GB of HBM2 on a single board, intended for accelerating TensorFlow-based machine learning workloads. From a leased Google Compute Engine VM, Cloud TPU resources can be used alongside GCP’s existing CPU and GPU offerings. First revealed at Google I/O 2016, the original TPU was a PCIe-based accelerator designed for inference workloads, and for the most part the TPUv1 was used internally. This past summer, Google announced its inference- and training-oriented successor, the TPUv2, outlining plans to incorporate it into their cloud...

Intel Shipping Nervana Neural Network Processor First Silicon Before Year End

This week at the Wall Street Journal’s D.Live 2017, Intel unveiled its Nervana Neural Network Processor (NNP), formerly known as Lake Crest, and announced plans to ship first silicon...

25 by Nate Oh on 10/18/2017

Google’s Tensor Processing Unit: What We Know

If you followed Google’s announcements at I/O 2016, one stand-out from the keynote was the mention of a Tensor Processing Unit, or TPU (not to be confused with thermoplastic...

39 by Joshua Ho on 5/20/2016

The Rush to Bitcoin ASICs: Ravi Iyengar launches CoinTerra

Bitcoin is a topic that AnandTech has carefully steered away from, due to the ever-changing state of the market and the opinions within it. For...

51 by Ian Cutress on 8/27/2013
