NVIDIA Introduces Dual Cortex A9 Based Tegra 2
by Anand Lal Shimpi on January 7, 2010 2:00 PM EST
A month ago NVIDIA shared this slide with me:
It's a graph of the total available market, according to NVIDIA, for its Tegra SoC (System-on-Chip). This year alone NVIDIA estimates that there's around a $4B market for Tegra. Next year it grows to $6B. By 2013 the total available market for NVIDIA's Tegra SoC reaches over $10B. That's more money than NVIDIA ever made from the PC market.
In order to compete in that space you need a competent chip. Today NVIDIA is announcing its second generation Tegra SoC. It's creatively named the Tegra 2 and this is what it looks like in block diagram form:
The SoC is made up of 8 independent processors, up from 7 in the original Tegra. The first two are the most exciting to me - a pair of ARM Cortex A9 cores. These are dual-issue, out-of-order cores from ARM running at up to 1GHz. If you thought the A8 was fast, these things should be much faster.
The original Tegra used a single ARM11 core. It was multi-core capable but the only version NVIDIA ever shipped only had a single ARM11. By now you know that ARM11 is unreasonably slow and thus you see my biggest problem with Tegra 1. Tegra 2 addresses this in a grand way. NVIDIA skipped over Cortex A8 entirely and went to what it believes is a more power efficient, higher performing option with the A9. I'll go deeper into the A9's architecture shortly, but to put it bluntly - A8 is dead in my eyes, Cortex A9 is what you want.
The next processor is an audio decode core. NVIDIA acquired PortalPlayer in 2007 for somewhere around $350M. PortalPlayer SoCs were used in the first five generations of iPods. PortalPlayer contributed much of NVIDIA's know-how when it came to building SoCs and audio decoders. NVIDIA is particularly proud of its audio decode core, claiming that it can deliver system power in the low 10s of mW while playing an MP3. It's difficult to quantify that claim. Microsoft lists Zune HD battery life at 33 hours while playing MP3s, while Apple claims the iPod Touch can do the same for 30 hours. Is NVIDIA responsible for the Zune's longer MP3 playback battery life? I've got no clue.
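For a rough sanity check on those battery life numbers, you can work backwards from battery capacity. A sketch, where the ~2.4 Wh battery and ~80 mW whole-system draw are my own illustrative assumptions, not official specs:

```python
# Rough sanity check on the MP3 playback claim: hours of playback
# implied by a given average system power draw. The battery capacity
# below is a hypothetical figure, not an official spec.

def playback_hours(battery_wh: float, system_power_mw: float) -> float:
    """Ideal playback time in hours, ignoring conversion losses."""
    return battery_wh / (system_power_mw / 1000.0)

# Assuming a ~2.4 Wh battery and ~80 mW average whole-system draw
# (NVIDIA's "low 10s of mW" covers decode; the display-off system adds more):
hours = playback_hours(2.4, 80)
print(f"{hours:.0f} hours")  # -> 30 hours
```

At those assumed figures you land right in the 30-hour range both vendors quote, which is why the claim is plausible but impossible to attribute to any one component.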
Given that this isn't 1995, audio decoding isn't very hard or very interesting, so let's move on. The next two cores are for video encode and decode. On the encode side NVIDIA claims to be able to accelerate the encode of 1080p H.264 video. This is up from 720p in the original Tegra and particularly important for any handsets that might include a video camera. Bitrates, power consumption and other pertinent details remain unknown.
The video decode side is where NVIDIA believes it has an advantage. Tegra's video decode processor accelerates up to 1080p high profile H.264 video at bitrates in the 10s of megabits per second. The Samsung SoC in the iPhone 3GS is limited to only 480p H.264 decode despite Samsung claiming 1080p decode support on its public Cortex A8 SoC datasheets. NVIDIA insists that no one else can do 1080p decode at high bitrates in a remotely power efficient manner. Tegra's 1080p decode can be done in the low 100s of mW. NVIDIA claims that the competition often requires well over 1W of total system power to do the same because they rely on the CPU to do some of the decoding. Again, this is one of those difficult to validate claims. Imagination has demonstrated very low CPU utilization 1080p H.264 decode on its PowerVR SGX core, but I have no idea of the power consumption.
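Claims like these are easier to reason about as energy per movie. A back-of-envelope sketch, with every power and runtime figure below an illustrative assumption pulled from the ranges NVIDIA quotes, not a measurement:

```python
# Back-of-envelope energy cost of decoding a full movie, comparing
# NVIDIA's "low 100s of mW" claim against the "well over 1W" figure
# it quotes for CPU-assisted decode. All numbers are illustrative
# assumptions, not measurements.

def decode_energy_wh(power_mw: float, runtime_hours: float) -> float:
    """Energy in watt-hours consumed at a constant power draw."""
    return power_mw / 1000.0 * runtime_hours

movie_hours = 2.0
tegra = decode_energy_wh(200, movie_hours)          # dedicated decode block
cpu_assisted = decode_energy_wh(1200, movie_hours)  # CPU helping out
print(f"Tegra 2: {tegra:.1f} Wh, CPU-assisted: {cpu_assisted:.1f} Wh")
# -> Tegra 2: 0.4 Wh, CPU-assisted: 2.4 Wh
```

If the claimed numbers hold, a two-hour 1080p movie costs a fraction of a watt-hour on the dedicated block versus a meaningful chunk of a smartphone battery on a CPU-assisted path, which is the whole argument NVIDIA is making.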
NVIDIA's numbers are interesting, but not 3rd party verified
So let's see, that's two ARM Cortex A9 cores, an audio core, video encode and video decode - we're up to five at this point. The next processor is used for image signal processing. In other words it's the core that drives a still/video camera in a Tegra handset. The processor supports up to 12MP sensors, auto whitebalance, auto focus and general video processing on either a still picture or a video stream. The output can be routed to the next core: Tegra 2's GeForce GPU.
NVIDIA wasn't willing to say much about Tegra's graphics core other than it was their own design. NVIDIA confirmed that the only 3rd party IP in Tegra 2 is the ARM cores; the rest is designed in house. And if you were wondering, Tegra 2 is the other platform that Epic demonstrated its Unreal Engine 3 mobile technology on.
The GPU in Tegra 2 is the same architecture as Tegra 1 (OpenGL ES 2.0 is supported), just higher performance. NVIDIA expects a 2 - 3x performance increase thanks to improved efficiency, more memory bandwidth and a higher clock rate.
The original Tegra only supported LPDDR1, while Tegra 2 supports LPDDR2. The Zune HD's Tegra SoC had a 32-bit 333MHz datarate LPDDR1 memory bus, resulting in 1.33GB/s of memory bandwidth. Tegra 2 in a single package with integrated memory should deliver about twice that.
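That 1.33GB/s figure is just bus width times data rate. A quick sketch; the LPDDR2-667 data rate for Tegra 2 is my assumption, chosen only to match the "about twice that" claim:

```python
# Peak memory bandwidth = bus width (bytes) x data rate.
# The Tegra 2 data rate below is an assumption (LPDDR2-667),
# not a confirmed spec.

def bandwidth_gbs(bus_bits: int, datarate_mhz: float) -> float:
    """Peak bandwidth in GB/s (decimal gigabytes)."""
    return bus_bits / 8 * datarate_mhz * 1e6 / 1e9

print(f"Tegra 1: {bandwidth_gbs(32, 333):.2f} GB/s")  # -> 1.33 GB/s
print(f"Tegra 2: {bandwidth_gbs(32, 667):.2f} GB/s")  # -> 2.67 GB/s
```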
NVIDIA believes that while other SoC makers can promise higher theoretical performance, Tegra and Tegra 2 deliver better real world gaming performance thanks to everything from the hardware to the software stack. Given NVIDIA's experience optimizing desktop GPU drivers, I've got no problem giving NVIDIA the benefit of the doubt here.
Tegra 1 was able to run Quake 3 at 720p with AA at over 40 fps, which according to NVIDIA was faster than any other SoC in a handset today. I haven't personally benchmarked Quake 3 on any SoCs so I can't really validate that claim either.
Ok, only one processor left and this one is simple. Tegra 2 (like Tegra) has an ARM7 processor that is used for chip management. It handles dataflow, power management and other similar tasks.
You'll notice the one thing missing from NVIDIA's Tegra 2 is a cellular modem. There simply isn't one. NVIDIA's philosophy is to focus on the core compute functions of an SoC that require no carrier or FCC testing. An OEM could mate a Tegra 2 with a tried and true modem, losing out on integration but winning on time to market. Given the sheer number of different wireless networks in the world, leaving the modem out of the design makes sense to me. But then again I don't make smartphones. It may prevent Tegra 2 from going into the cheapest solutions, but that's not where NVIDIA wants to be in any case.
sprockkets - Friday, January 8, 2010
It's a loser called DLeRium who thinks I don't know the diff btw x86 and PPC or ARM.
Perhaps he doesn't know I compile code from scratch all the time for Linux and most of that code easily compiles on multiple archs. That doesn't mean it's automatic to go from x86 to ARM, but it isn't an insurmountable hurdle either.
sprockkets - Friday, January 8, 2010
Hasn't stopped Doom or Quake 3 from arriving on the iPhone, has it?
SilthDraeth - Friday, January 8, 2010
WoW runs in OpenGL as well.
rennya - Friday, January 8, 2010
And that's why Quake 3 is used in the demo. They are easily portable, unlike WoW or Sims games.
Now if we have WoW or Sims games on iPhone you may have a point.
sprockkets - Friday, January 8, 2010
WoW is a ways off perhaps, but again, my original point was it is on OS X, which doesn't have DirectX. The previous poster was going on about x86 vs. ARM limitations.
UT2004 had DirectX and OpenGL versions.
For that matter, I believe the real question is, why would you want to game on a 3.7-5 inch screen? Even if you used video out, you would then end up hooking up some form of controller, and you'd be back to square one where you were with a computer.
But hey, surprise me. Boxee has the Tegra2 in it. Linux is a common denominator in phones and with ARM and OpenGL ES.
Genx87 - Friday, January 8, 2010
How big are Nintendo DS screens? Those seem to sell at a clip higher than the standalone consoles.
puffpio - Thursday, January 7, 2010
My next smartphone better have this...as well as my future HTPC/NAS box/fileserver/DLNA server/torrentbox all-in-one device thingy
puffpio - Thursday, January 7, 2010
And said smartphone needs HDMI out to drive a 1080p display
Goty - Thursday, January 7, 2010
I can find solutions that perform "well enough" until NVIDIA decides to stop being one of the most underhanded and disrespectful companies I can think of.
sprockkets - Thursday, January 7, 2010
"But rest assured that if you're buying a smartphone in 2010, it's not Snapdragon that you want but something based on Cortex A9"
LOL, now we are basing our smartphone purchases like we did 10 years ago with computers?
OK, let's wait for Tegra2 so we can watch an 8-12GB pirated 1920x800 HDx264 hi profile movie on an 800x480 screen with headphones or even worse, a 2 inch tinny speaker even though it has DTS audio. LOL.