Introduction

If desktop graphics hardware can be more than a little confusing, deciphering the performance of mobile graphics parts can be (and has historically been) an absolute nightmare. Way back in the day it was at least fairly easy to figure out which desktop chip was hiding in which mobile part, but both AMD and NVIDIA have largely severed ties between mobile and desktop branding. They may not readily admit it, and with certain models they still rely heavily on the cachet of their desktop hardware, but it's by and large true. So to help you make sense of mobile graphics, we present the first in what will hopefully be a regular series of guides.

I started putting guides like this one together back at my alma mater, NotebookReview, and they've always been well received. It's not hard to understand why: while NVIDIA and AMD are usually forthcoming with the specs of their desktop kit, they've historically been cagey about their notebook graphics hardware. As a result, sites like this one have had to sift through information about different laptops, compare notes with other sites and readers, and compile the data ourselves, while forums light up with questions like "Can this laptop play XYZ?"

Thankfully, the advent of DirectX 11 drastically simplified my job. In past generations, product lines were split across different shader models or even entire DirectX versions, and every split added confusion; with DirectX 11, pretty much everybody is on board with the same fundamental feature set, and AMD and NVIDIA both support it across their entire mobile lineups. Intel remains the odd man out, as you'll see.
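
A quick aside for the tinkerers: if you want to verify which DirectX feature level a given GPU actually exposes, the Direct3D 11 runtime will tell you directly. The sketch below is purely illustrative rather than part of any testing methodology; it asks Windows for the highest feature level the default adapter supports, without actually creating a device.

    // Minimal sketch: query the highest Direct3D feature level supported
    // by the default GPU. Build with: cl /EHsc dxlevel.cpp d3d11.lib
    #include <windows.h>
    #include <d3d11.h>
    #include <cstdio>

    int main() {
        const D3D_FEATURE_LEVEL requested[] = {
            D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_10_1,
            D3D_FEATURE_LEVEL_10_0, D3D_FEATURE_LEVEL_9_3,
            D3D_FEATURE_LEVEL_9_2,  D3D_FEATURE_LEVEL_9_1,
        };
        D3D_FEATURE_LEVEL highest = D3D_FEATURE_LEVEL_9_1;

        // Passing NULL for the device and context pointers means no device
        // is actually created; the call just reports the best feature level.
        HRESULT hr = D3D11CreateDevice(
            NULL, D3D_DRIVER_TYPE_HARDWARE, NULL, 0,
            requested, ARRAYSIZE(requested), D3D11_SDK_VERSION,
            NULL, &highest, NULL);

        if (SUCCEEDED(hr))
            printf("Highest feature level: %d_%d\n",
                   (highest >> 12) & 0xF, (highest >> 8) & 0xF);
        else
            printf("No Direct3D 11-capable hardware device found.\n");
        return 0;
    }

On Sandy Bridge's HD 3000 this reports 10_1, while Llano and current dedicated AMD/NVIDIA parts report 11_0.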

We'll break things down into three categories. The first is integrated graphics, which has moved entirely on-package and even on-die over the past year; it's surprising how quickly that change occurred. Coupled with NVIDIA's exit from the chipset business, that means we're strictly looking at Intel and AMD here. The second and third categories are dedicated to AMD's and NVIDIA's mobile lines. Wherever possible we'll also link you to a review that demonstrates the performance of the graphics hardware in question. And note that the number of shaders, CUDA cores, or EUs on a given part is ONLY comparable to other parts from the same vendor; 96 of NVIDIA's CUDA cores are not comparable to, say, 160 shaders from an AMD Radeon.
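
To put some rough numbers on that: a common back-of-the-envelope metric is peak single-precision throughput, ALUs x 2 FLOPS (one multiply-add per clock) x clock speed. The sketch below runs that math with example clocks in the ballpark of current parts from each vendor (the specific figures are ours, for illustration only). The results land close together despite wildly different ALU counts, and even then, peak GFLOPS says nothing about how well AMD's five-wide VLIW shaders are actually utilized in games; cross-vendor comparisons need benchmarks, not spec sheets.

    // Illustrative only: peak single-precision throughput, assuming each
    // ALU retires one multiply-add (2 FLOPS) per clock.
    #include <cstdio>

    static double peak_gflops(int alus, double clock_mhz) {
        return alus * 2.0 * clock_mhz / 1000.0;
    }

    int main() {
        // NVIDIA's CUDA cores run on a separate "shader clock," roughly
        // double the core clock on current Fermi-based mobile parts.
        printf("96 CUDA cores @ 1344MHz shader clock: ~%.0f GFLOPS\n",
               peak_gflops(96, 1344.0));
        // AMD's VLIW5 shaders run at the core clock.
        printf("160 AMD shaders @ 750MHz core clock:  ~%.0f GFLOPS\n",
               peak_gflops(160, 750.0));
        return 0;
    }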

Integrated Graphics

"Too Slow to Play" Class: Intel HD Graphics (Arrandale), Intel Atom IGP, AMD Radeon HD 4250
Specs aren't provided because in this case they aren't really needed: none of these integrated graphics parts are going to be good for much more than the odd game of Unreal Tournament 2004. Intel had a devil of a time getting their IGP act together prior to the advent of Sandy Bridge, while AMD's Radeon HD 3000/3100/3200/4200/4225/4250 core (yes, it's all basically the same core) is really showing its age. Thankfully, outside of Atom's IGP, all of these are on their way out. As for gaming on Atom, there's always the original StarCraft.

Intel HD 3000 (Sandy Bridge)
12 EUs, Core Clock: Varies
With Sandy Bridge, Intel was able to produce an integrated graphics part able to rival AMD's and NVIDIA's budget entries. In fact, in our own testing we found the HD 3000 largely able to keep up with AMD's dedicated Radeon HD 6450 and, to a lesser extent, the 6470; NVIDIA's current mobile lineup generally doesn't extend that low (likely excepting the GT 520M and GT 520MX). That said, there are still some caveats to the HD 3000: while Intel's questionable driver quality is largely behind it, you may still experience the odd compatibility issue from time to time (when Sandy Bridge dropped, Fallout 3 had an issue), and more punishing games like Mafia II and Metro 2033 will be largely out of its reach. The clocks on the HD 3000 also vary greatly, with a starting clock of 650MHz for mainstream parts, 500MHz for low-voltage parts, and just 350MHz for ultra-low-voltage parts. Turbo clocks get even weirder, ranging anywhere from 900MHz to 1.3GHz depending on the processor model. Still, it's nice to no longer have to roll your eyes at the suggestion of doing some casual gaming on Intel's integrated hardware. (Sandy Bridge Review)

AMD Radeon HD 6250/6310 (Brazos)
80 Shaders, 8 TMUs, 4 ROPs, Core Clock: 280MHz (6250), 500MHz (6310)
In Brazos, AMD produced a workable netbook-level processor core and grafted last generation's Radeon HD 5450/5470 core onto it. The result is an integrated graphics processor with a decent amount of horsepower for low-end casual gaming, though in some cases it's going to be hamstrung by the comparatively slow Bobcat processor cores. That's perfectly fine, as Brazos is generally a more desirable alternative to Atom + NG-ION netbooks, offering more processor performance and vastly superior battery life. Just don't expect to do anything but the most casual gaming on a Brazos-powered netbook. (HP dm1z Review)

AMD Radeon HD 6380G/6480G/6520G/6620G (Llano)
160/240/320/400 Shaders, 8/12/16/20 TMUs, 4/4/8/8 ROPs (6380G/6480G/6520G/6620G), Core Clock: 400-444MHz
Llano isn't available in force quite yet, but we have a good idea of how the 6620G performs, and we expect IGP performance to scale down in a way that makes the model numbers fairly appropriate. The long and short of Llano is that the processor half pales in comparison to Sandy Bridge, but the graphics hardware is monstrous. Gamers on an extreme budget are likely to be well served by picking up a notebook with one of AMD's A6 or A8 processors in it, with Llano promising near-midrange mobile graphics performance. (Llano Mobile Review)
