SoC

This week, Google announced Cloud TPU beta availability on the Google Cloud Platform (GCP), accessible through its Compute Engine infrastructure-as-a-service. Built around the second generation of Google’s tensor processing units (TPUs), the standard Cloud TPU configuration remains four custom ASICs and 64 GB of HBM2 on a single board, intended for accelerating TensorFlow-based machine learning workloads. With a leased Google Compute Engine VM, Cloud TPU resources can be used alongside current Google Cloud Platform CPU and GPU offerings. First revealed at Google I/O 2016, the original TPU was a PCIe-based accelerator designed for inference workloads, and for the most part the TPUv1 was used internally. This past summer, Google announced the inference- and training-oriented successor, the TPUv2, outlining plans to incorporate it into their cloud...
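As a rough illustration of the workflow described above, the sketch below shows how a TensorFlow program running on a leased Compute Engine VM might attach to a Cloud TPU and replicate a model across its cores. It uses the present-day tf.distribute API rather than anything specific from the announcement, and the TPU name "my-cloud-tpu" is a hypothetical placeholder.

    # Minimal sketch: connect a TensorFlow job on a Compute Engine VM to a Cloud TPU.
    # The TPU name "my-cloud-tpu" is a hypothetical placeholder, not from the article.
    import tensorflow as tf

    # Locate the Cloud TPU attached to this VM and initialize it.
    resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="my-cloud-tpu")
    tf.config.experimental_connect_to_cluster(resolver)
    tf.tpu.experimental.initialize_tpu_system(resolver)

    # Replicate subsequent model computation across the TPU's cores.
    strategy = tf.distribute.TPUStrategy(resolver)

    with strategy.scope():
        # Any Keras model built here is compiled for and distributed on the TPU.
        model = tf.keras.Sequential([
            tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
            tf.keras.layers.Dense(10),
        ])
        model.compile(
            optimizer="adam",
            loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
        )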

ARM Aims at Intel, Cortex A15 Headed for Smartphones, Notebooks and Servers

Last month TI announced it was the first to license ARM’s next-generation Eagle core. Today, ARM is announcing the official name of that core: it’s the ARM Cortex A15. Architectural...

36 by Anand Lal Shimpi on 9/9/2010

TI First to License ARM's Next-Generation Eagle Core

In our smartphone and tablet reviews we make sure to spend a good amount of time talking about the silicon powering these devices. There’s no reason that handset and...

22 by Anand Lal Shimpi on 8/9/2010

Intel Unveils Moorestown and the Atom Z600, The Fastest Smartphone Platform?

When I wrote my first article on Intel's Atom architecture I called it The Journey Begins. I did so because while Atom has made a nice home in netbooks...

68 by Anand Lal Shimpi on 5/4/2010
