AMD Radeon HD Mobile Graphics Introduction

While AMD's Radeon HD cards are extremely competitive on the desktop, the notebook space is far more complicated. Mobile Radeons and GeForces are both fairly common, with neither vendor owning the market outright; this is actually an unusual equilibrium, as each generation of notebooks has typically seen a massive swing favoring one vendor over the other.

So what is AMD offering that you can't find on NVIDIA hardware? Arguably superior anti-aliasing and image quality, typically slightly higher performance than competing mobile parts, and support for Eyefinity. You'll find GDDR5 more frequently employed with AMD's chips to help mitigate narrow memory bus widths, too.

The essential problem with Radeons right now is that outside of Eyefinity they're still playing catch-up with NVIDIA's mobile solutions. Performance may be excellent in some cases, but NVIDIA leverages Optimus across their 500M line while support for switchable graphics in the Radeon HD 6000M series is spotty. NVIDIA's Verde mobile graphics driver initiative is also very mature, while vendor support for AMD's mobile graphics driver is again spotty. That last point isn't entirely AMD's fault: vendors like Toshiba and Sony inexplicably opt out of the program despite the drivers working fine on their hardware. Finally, there are going to be niche cases where NVIDIA's support for CUDA and PhysX is relevant. OpenCL may eventually become the standard, but professional grade applications like Adobe Premiere Pro CS5 and CS5.5 can get a substantial boost from NVIDIA kit (provided you hack the "validated GPU" list to include yours).
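
As a quick sanity check on that last point, here's a minimal sketch, assuming the third-party pyopencl package and a working vendor OpenCL driver are installed (neither is referenced in this guide), that lists whatever OpenCL devices a notebook's Radeon or GeForce actually exposes:

    # Minimal sketch: enumerate the OpenCL devices a system exposes. Assumes pyopencl
    # and a vendor OpenCL runtime are installed; neither is part of this guide.
    import pyopencl as cl

    for platform in cl.get_platforms():
        for device in platform.get_devices():
            print(f"{platform.name}: {device.name} "
                  f"({device.max_compute_units} compute units, "
                  f"{device.global_mem_size // 2**20} MB)")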

There's one more comparative problem with AMD's lineup: while NVIDIA took their 500M series (largely an exercise in rebranding) as an opportunity to do some housekeeping, AMD basically integrated the entire Mobility Radeon HD 5000 line into the 6000Ms. Feature-wise this isn't a major issue, but it results in an incredibly bloated mobile lineup, with mobile chips from the Evergreen line occupying the same series as newer chips from the Northern Islands refresh.

AMD Radeon HD 6300M
80 Shaders, 8 TMUs, 4 ROPs, Core Clocks: 500MHz (6330M/6350M) or 750MHz (6370M)
64-bit Memory Bus, DDR3, Effective Memory Clocks: 1.6GHz (6330M/6350M) or 1.8GHz (6350M/6370M)
Desktop Counterpart: Radeon HD 5450 (Cedar)

The 6300M series is the carryover/rebadging of the Mobility Radeon HD 5400 line. This is roughly the same graphics core as is integrated into Brazos, featuring a memory bus that's honestly just too narrow to really handle any serious gaming. With the advent of Sandy Bridge, it's also outclassed by Intel's integrated graphics hardware and as a result remains more of a solution for corner cases where an inexpensive dedicated graphics processor is needed. (No review available, but the Mobility Radeon HD 5470 in the Dell Studio 14 is comparable.)
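
To put a number on "too narrow," here's a back-of-the-envelope sketch, using my own arithmetic rather than figures from AMD, of the peak memory bandwidth a 64-bit DDR3 interface like the 6300M's can deliver:

    # Peak theoretical memory bandwidth: (bus width in bits / 8) bytes per transfer
    # multiplied by the effective data rate in GT/s gives GB/s.
    def mem_bandwidth_gbps(bus_width_bits, effective_clock_ghz):
        return bus_width_bits / 8 * effective_clock_ghz

    # Radeon HD 6300M: 64-bit DDR3 at 1.6GHz effective
    print(mem_bandwidth_gbps(64, 1.6))  # 12.8 GB/s, and that has to feed the entire GPU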

AMD Radeon HD 6400M
160 Shaders, 8 TMUs, 4 ROPs, Core Clocks: 480MHz-800MHz
64-bit Memory Bus, DDR3 or GDDR5 (6490M only), Effective Memory Clocks: 1.6GHz (DDR3) or 3.2GHz (GDDR5)
Desktop Counterpart: Radeon HD 6450 (Caicos)

Doubling the shader count of Cedar helps the mobile Caicos reach parity with Sandy Bridge's IGP in the 6430M and 6450M and then beat it with the 6470M and GDDR5-equipped 6490M. What the 6400M brings to the table is what AMD as a whole brings to the table compared to Intel's graphics: better game compatibility and Eyefinity multi-monitor support. Hardware with 64-bit memory buses should still be confined to running games at 1366x768, and heavier games are going to be off limits, but the 6400M series should satisfy more casual players. (Toshiba Tecra R850 for the HD 6450M; HP EliteBook 8460p for the HD 6470M.)
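
Using the same back-of-the-envelope formula as before (again, my own arithmetic rather than AMD's figures), the jump from DDR3 to GDDR5 on the 6490M's otherwise identical 64-bit bus works out as follows:

    # Peak theoretical memory bandwidth in GB/s: bus width / 8 x effective data rate.
    def mem_bandwidth_gbps(bus_width_bits, effective_clock_ghz):
        return bus_width_bits / 8 * effective_clock_ghz

    print(mem_bandwidth_gbps(64, 1.6))  # 12.8 GB/s: 64-bit DDR3, as on the 6430M/6450M/6470M
    print(mem_bandwidth_gbps(64, 3.2))  # 25.6 GB/s: 64-bit GDDR5, as on the 6490M -- double, on the same bus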

AMD Radeon HD 6500M
400 Shaders, 20 TMUs, 8 ROPs, Core Clocks: 500-650MHz
128-bit Memory Bus, DDR3 or GDDR5 (6570M only), Effective Memory Clocks: 1.8GHz (DDR3) or 3.6GHz (GDDR5)
Desktop Counterpart: Radeon HD 5570/5670 (Redwood)

AMD opted to employ a very close derivative of this core for Llano, and it should really be the minimum for gamers looking to play on a Radeon. A GDDR5-equipped model will go a long way towards improving performance at higher resolutions, but generally speaking the 6500M series will at least be fine for pushing settings at 1366x768 and handling most games at 1600x900. This is a rebadge of the Mobility Radeon HD 5600/5700 series. (No review available, but the Mobility Radeon HD 5650 in the Compal NBLB2 is comparable.)

AMD Radeon HD 6600M/6700M
480 Shaders, 24 TMUs, 8 ROPs, Core Clocks: 500-725MHz
128-bit Memory Bus, DDR3 or GDDR5, Effective Memory Clocks: 1.6GHz (6630M/6730M) or 1.8GHz (6650M) or 3.6GHz (6750M/6770M)
Desktop Counterpart: Radeon HD 6570/6670 (Turks)

Bifurcating a single chip into two lines and then not even using the class of memory as a signifier is one of the more baffling decisions you'll find in this guide (though the prize has to go to NVIDIA's GT 555M), but AMD did the same thing with the 5600M/5700M series. GDDR5 is always going to be preferable to allow the graphics core to stretch its legs, but generally speaking this is a more minor, incremental improvement on its predecessor than Caicos was on Cedar, and the same rules for the 6500M apply here. (Look at the results for the 6630M in our Llano review.)

AMD Radeon HD 6800M
800 Shaders, 40 TMUs, 16 ROPs, Core Clocks: 575MHz-675MHz
128-bit Memory Bus, DDR3 or GDDR5, Effective Memory Clocks: 1.6GHz (DDR3) or 3.2GHz (6850M GDDR5) or 4GHz (6870M)
Desktop Counterpart: Radeon HD 5770 (Juniper)

The astute reader is going to notice that, once again, AMD has rebranded their last generation, this time the 5800M series. While there are specs for DDR3-powered versions, the GDDR5-based ones are far more common in the wild. That's good, because the 128-bit memory bus is too anemic on its own to feed 800 of AMD's shader cores. Serious gamers are going to want to look at the 6800M as a minimum for gaming at 1080p. It's important to note that the 6800M is still going to be consistently slower than the desktop 5750 and 5770 due to substantially reduced core clocks (the desktop chips start at 700MHz). The 6870M is also just 25MHz slower than the Mobility Radeon HD 5870, so as I mentioned before, these are going to be a solid choice for gamers. (No review available, but the Mobility Radeon HD 5870 in the ASUS G73Jh is comparable.)
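
To make the clock deficit concrete, here's a rough sketch of peak shader throughput for Juniper's 800 VLIW5 ALUs; the desktop HD 5770's 850MHz core clock is my own reference figure, not one quoted in this guide:

    # Peak single-precision throughput for a VLIW5 part: shaders x 2 FLOPs (multiply-add) x clock.
    def gflops_vliw5(shaders, core_clock_mhz):
        return shaders * 2 * core_clock_mhz / 1000

    for name, clock_mhz in [("HD 6850M", 575), ("HD 6870M", 675), ("desktop HD 5770", 850)]:
        print(f"{name}: {gflops_vliw5(800, clock_mhz):.0f} GFLOPS peak")
    # 920, 1080, and 1360 GFLOPS respectively: the mobile parts give up roughly a fifth
    # to a third of Juniper's peak throughput before memory bandwidth even enters the picture.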

AMD Radeon HD 6900M
960 Shaders, 48 TMUs, 32 ROPs, Core Clocks: 580MHz (6950M) or 680MHz (6970M)
256-bit Memory Bus, GDDR5, Effective Memory Clocks: 3.6GHz
Desktop Counterpart: Radeon HD 6850 (Barts)

This is as powerful as it gets on the AMD side. The 6970M is going to be somewhat outclassed by the GTX 580M, but should tangle just fine with the 570M and thoroughly trounce anything slower. Likewise, you're apt to see these employed in a mobile CrossFire solution, leveraging the improvements in CrossFire scaling that AMD brought with the Barts core (along with the rest of Northern Islands). While it'll never be as fast as a desktop 6850 due to the reduced core clocks, the 6900M series is an extremely potent mobile gaming solution. (Alienware M17x R3 Review)
