AMD Radeon HD Mobile Graphics Introduction

While AMD's Radeon HD cards are extremely competitive on the desktop, the notebook space is far more complicated. Mobile Radeons and GeForces are both fairly common, with neither vendor owning the market more decisively than the other; this is actually an unusual equilibrium, as each generation of notebooks has typically seen a massive swing favoring one vendor over the other.

So what is AMD offering that you can't find on NVIDIA hardware? Arguably superior anti-aliasing and image quality, typically slightly higher performance than competing mobile parts, and support for Eyefinity. You'll find GDDR5 more frequently employed with AMD's chips to help mitigate narrow memory bus widths, too.

The essential problem with Radeons right now is that outside of Eyefinity they're still playing catch-up with NVIDIA's mobile solutions. Performance may be excellent in some cases, but NVIDIA leverages Optimus across their 500M line while support for switchable graphics in the Radeon HD 6000M series is spotty. NVIDIA's Verde mobile graphics driver initiative is also very mature, while support for AMD's mobile graphics drivers across vendors is again spotty. That last point isn't entirely AMD's fault: vendors like Toshiba and Sony inexplicably opt out of the program despite the drivers working fine on their hardware. Finally, there are going to be niche cases where NVIDIA's support for CUDA and PhysX is relevant. OpenCL may eventually become the standard, but professional-grade applications like Adobe Premiere Pro CS5 and CS5.5 can get a substantial boost from NVIDIA kit (provided you hack the "validated GPU" list to include yours).
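
If you want to check whether a given notebook GPU actually exposes OpenCL before betting on it, a quick device query is the simplest test. Below is a minimal sketch using the third-party pyopencl package (our choice for illustration; any OpenCL binding will do, and it assumes a vendor OpenCL runtime is installed):

```python
# Minimal OpenCL device query - lists every GPU the installed runtimes expose.
# Requires the pyopencl package and a vendor OpenCL driver (e.g. AMD APP).
import pyopencl as cl

for platform in cl.get_platforms():
    try:
        gpus = platform.get_devices(device_type=cl.device_type.GPU)
    except cl.RuntimeError:
        continue  # this platform has no GPU devices
    for device in gpus:
        print("%s: %s, %d MB global memory, %d compute units" % (
            platform.name, device.name,
            device.global_mem_size // (1024 ** 2),
            device.max_compute_units))
```

If a card shows up here, OpenCL-accelerated applications can at least see it; whether a given application actually uses it is, as with Premiere Pro's CUDA whitelist, another matter.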

There's one more comparatively minor problem with AMD's lineup: while NVIDIA took their 500M series (largely an exercise in rebranding) as an opportunity to do some housekeeping, AMD basically integrated the entire Mobility Radeon HD 5000 line into the 6000Ms. Feature-wise this isn't a major issue, but it results in an incredibly bloated mobile lineup, with mobile chips from the Evergreen line occupying the same series as newer chips from the Northern Islands refresh.

AMD Radeon HD 6300M
80 Shaders, 8 TMUs, 4 ROPs, Core Clocks: 500MHz (6330M/6350M) or 750MHz (6370M)
64-bit Memory Bus, DDR3, Effective Memory Clocks: 1.6GHz (6330M/6350M) or 1.8GHz (6350M/6370M)
Desktop Counterpart: Radeon HD 5450 (Cedar)

The 6300M series is the carryover/rebadging of the Mobility Radeon HD 5400 line. This is roughly the same graphics core as is integrated into Brazos, featuring a memory bus that's honestly just too narrow to really handle any serious gaming. With the advent of Sandy Bridge, it's also outclassed by Intel's integrated graphics hardware and as a result remains more of a solution for corner cases where an inexpensive dedicated graphics processor is needed. (No review available, but the Mobility Radeon HD 5470 in the Dell Studio 14 is comparable.)
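
To put numbers on "too narrow": peak memory bandwidth is just the effective memory clock (transfers per second) times the bus width in bytes. A back-of-the-envelope sketch, using figures pulled from the spec blocks in this guide:

```python
# Peak theoretical memory bandwidth = effective clock (GT/s) * bus width in bytes.
# Clocks and bus widths below are taken from the spec blocks in this guide.
def bandwidth_gbps(effective_clock_ghz, bus_width_bits):
    return effective_clock_ghz * (bus_width_bits / 8.0)  # GB/s

configs = [
    ("6300M/6400M, 64-bit DDR3-1600", 1.6, 64),
    ("6490M, 64-bit GDDR5 (3.2GHz)",  3.2, 64),
    ("6500M, 128-bit DDR3-1800",      1.8, 128),
    ("6570M, 128-bit GDDR5 (3.6GHz)", 3.6, 128),
]
for name, clock, width in configs:
    print("%s: %.1f GB/s" % (name, bandwidth_gbps(clock, width)))
```

That works out to 12.8GB/s for the 64-bit DDR3 parts, against 57.6GB/s for a GDDR5-equipped 6570M. For reference, a dual-channel DDR3-1333 system feeding Sandy Bridge's IGP offers around 21GB/s (shared with the CPU), which is part of why these low-end dedicated parts have been outclassed.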

AMD Radeon HD 6400M
160 Shaders, 8 TMUs, 4 ROPs, Core Clocks: 480MHz-800MHz
64-bit Memory Bus, DDR3 or GDDR5 (6490M only), Effective Memory Clocks: 1.6GHz (DDR3) or 3.2GHz (GDDR5)
Desktop Counterpart: Radeon HD 6450 (Caicos)

Doubling the shader count of Cedar helps the mobile Caicos reach parity with Sandy Bridge's IGP in the 6430M and 6450M and then beat it with the 6470M and GDDR5-equipped 6490M. What the 6400M brings to the table is what AMD as a whole brings to the table compared to Intel's graphics: better game compatibility and Eyefinity multi-monitor support. Hardware with 64-bit memory buses should still be confined to running games at 1366x768, and heavier games are going to be off limits, but the 6400M series should satisfy more casual players. (Toshiba Tecra R850 for the HD 6450M; HP EliteBook 8460p for the HD 6470M.)

AMD Radeon HD 6500M
400 Shaders, 20 TMUs, 8 ROPs, Core Clocks: 500-650MHz
128-bit Memory Bus, DDR3 or GDDR5 (6570M only), Effective Memory Clocks: 1.8GHz (DDR3) or 3.6GHz (GDDR5)
Desktop Counterpart: Radeon HD 5570/5670 (Redwood)

AMD opted to employ a very close derivative of this core for Llano, and it should really be the minimum for gamers looking to play on a Radeon. A GDDR5-equipped model will go a long way towards improving performance at higher resolutions, but generally speaking the 6500M series will at least be fine for pushing settings at 1366x768 and most games at 1600x900. This is a rebadge of the Mobility Radeon 5600/5700 series. (No review available, but the Mobility Radeon HD 5650 in the Compal NBLB2 is comparable.)

AMD Radeon HD 6600M/6700M
480 Shaders, 24 TMUs, 8 ROPs, Core Clocks: 500-725MHz
128-bit Memory Bus, DDR3 or GDDR5, Effective Memory Clocks: 1.6GHz (6630M/6730M) or 1.8GHz (6650M) or 3.6GHz (6750M/6770M)
Desktop Counterpart: Radeon HD 6570/6670 (Turks)

Bifurcating a single chip into two lines and then not even using the class of memory as a signifier is one of the more baffling decisions you'll find in this guide (though the prize has to go to NVIDIA's GT 555M), but AMD did the same thing with the 5600M/5700M series. GDDR5 is always going to be preferable to allow the graphics core to stretch its legs, but generally speaking this is a more minor, incremental improvement on its predecessor than Caicos was on Cedar, and the same rules for the 6500M apply here. (Look at the results for the 6630M in our Llano review.)

AMD Radeon HD 6800M
800 Shaders, 40 TMUs, 16 ROPs, Core Clocks: 575MHz-675MHz
128-bit Memory Bus, DDR3 or GDDR5, Effective Memory Clocks: 1.6GHz (DDR3) or 3.2GHz (6850M GDDR5) or 4GHz (6870M)
Desktop Counterpart: Radeon HD 5770 (Juniper)

The astute reader is going to notice that, once again, AMD has rebranded their last generation, this time the 5800M series. While there are specs for DDR3-powered versions, the GDDR5-based ones are far more common in the wild. That's good, because the 128-bit memory bus is too anemic on its own to feed 800 of AMD's shader cores. Serious gamers are going to want to look at the 6800M as a minimum for gaming at 1080p. It's important to note that the 6800M is still going to be consistently slower than the desktop 5750 and 5770 due to substantially reduced core clocks (the desktop chips start at 700MHz). The 6870M is also just 25MHz slower than the Mobility Radeon HD 5870, so as I mentioned before, these are going to be a solid choice for gamers. (No review available, but the Mobility Radeon HD 5870 in the ASUS G73Jh is comparable.)
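
A quick sanity check on that clock gap: these are all 800-shader VLIW5 parts, so peak theoretical throughput (shaders times two FLOPs per clock for a multiply-add, times core clock) scales directly with clock speed. A rough sketch; the 850MHz desktop 5770 clock is our addition, while the mobile clocks come from the spec block above:

```python
# Peak single-precision throughput for 800-shader VLIW5 parts:
# shaders * 2 FLOPs/clock (multiply-add) * core clock.
def gflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1000.0

for name, clock_mhz in [("Radeon HD 6870M", 675),
                        ("Mobility Radeon HD 5870", 700),
                        ("Desktop Radeon HD 5770", 850)]:
    print("%s @ %dMHz: %.0f GFLOPS" % (name, clock_mhz, gflops(800, clock_mhz)))
```

On paper that puts the 6870M roughly 20% behind the desktop 5770 before memory bandwidth even enters the picture.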

AMD Radeon HD 6900M
960 Shaders, 48 TMUs, 32 ROPs, Core Clocks: 580MHz (6950M) or 680MHz (6970M)
256-bit Memory Bus, GDDR5, Effective Memory Clocks: 3.6GHz
Desktop Counterpart: Radeon HD 6850 (Barts)

This is as powerful as it gets on the AMD side. The 6970M is going to be somewhat outclassed by the GTX 580M, but should tangle just fine with the 570M and thoroughly trounce anything slower. Likewise, you're apt to see these employed in a mobile Crossfire solution, leveraging the improvements in Crossfire scaling that AMD brought with the Barts core (along with the rest of Northern Islands). While it'll never be as fast as a desktop 6850 due to the reduced core clocks, the 6900M series is an extremely potent mobile gaming solution. (Alienware M17x R3 Review)

Comments

  • ppeterka - Thursday, July 7, 2011 - link

    For most people, portability means more than the distance from your couch to your kitchen. Try lugging that 10 pound beast with you on the underground, try to fix up some slides in PowerPoint, or try to fit it into your hand luggage when flying to a meeting.

    It might be new to you, and I risk ruining your optimistic world, but laptops are work equipment too. For quite a few people... And as gaming notebooks are over-over-overpriced and then some, I find them useless unless someone is a traveling game hero... But there is a price in that case, and not only the pricetag: several other crippling compromises must be made when going that route.

    For the price, you could get a decent Brazos based netbook to lug around, AND a fully fledged SLI/CF desktop. You're much better off with this, as I assume
    * you don't play interactive, 3D intensive games while cooking (which even Brazos would support to a degree)
    * you don't plan on getting your gaming fix while underway

    Do you disagree with this?
  • rubbahbandman - Friday, July 8, 2011 - link

    I think you'd be surprised how affordable a good "gaming" laptop/desktop replacement is. I picked up the HP dv7-6143cl from Costco for only $875 along with a 2 yr warranty and it has some solid specs.

    2630QM, 8GB RAM, 6770M, and you'd think with a 17.3" screen it would be heavy, but it weighs only 7 lbs, less than a gallon of milk, and that's in spite of the ridiculous 9-cell battery it has (supposedly it can manage a 9.5 hr battery life).

    The native res is 1600x900, which isn't that special, but it works great for demanding games like Crysis 2. With the DX11 patch and high-res texture pack I can manage a solid 45-60fps, which is perfectly playable, and that pretty much sets the bar for challenging my system, so other than Crysis 2, I can crank up the settings to my heart's content.
  • Mediamon - Sunday, July 10, 2011 - link

    Almost had me sold. Costco's HP configurator shows a $1200 current price for a rig with those same specs. A $325 difference. You must know someone at Costco or be playing a joke.
  • chinedooo - Monday, July 11, 2011 - link

    I would suggest HP's dv6t6100 straight from their website: $1025 with tax for an i7 2630, 15.6" 1080p screen, 9-cell battery, and HD 6770M. This is after getting $450 off with a coupon, which is pretty much always available. The thing weighs like 5.5 lbs. It really is a great laptop.
  • scook9 - Thursday, July 7, 2011 - link

    I got the 2920xm because it can overclock (at all) and has considerably better turbo options.

    We know you think it is idiotic; most people do. Because they either a) can't appreciate mobility AND power or b) can't afford it.

    I never tried to argue that my M18x was a top value proposition ;) simply that the bang for the buck is not there for the GTX 580Ms vs. the 6970Ms.

    A 12 pound laptop is about as powerful as a 50 pound desktop. Additionally, it already has the UPS, screen, mouse, and keyboard built in (adding to value, mind you). If you cannot handle moving a 12 pound laptop, you are just pathetic. End of story. The thing does not have to be a damned frisbee, but it is plenty portable. I have traveled all over the country with high-end (large) laptops; it is perfectly doable.

    And as for your remark about being inferior to a desktop, I can share some benches if you still feel that way.

    Here is one: it can play Crysis MAXED out, with all settings very high and max AA, at 60 FPS at the native resolution. Don't spout off shit you have zero experience with; it makes you look like the child you are.

    SO, at the end of your rant the only real complaint I can come up with is price - yes, I could have spent that $4000 on a desktop, but I did not want to. Because I like being able to take my entire system with me wherever I go without having to think twice about it. My desktop is about 65 pounds by itself - THAT is not portable, while a laptop (even if it weighs 20 pounds) is always portable.
  • jensend - Wednesday, July 6, 2011 - link

    You say "it's even a little difficult to recommend spending up for a notebook with anything less than a GeForce GT 540M or Radeon HD 6500M/6600M/6700M unless you really need the faster CPU on top of it." - but considering the pricing, the power consumption disadvantage, and Llano's strong performance I don't see why you'd go with a discrete AMD chip less powerful than Turks+gddr5. Why would you go for a (equal shader count) 6500M? Sure, there's more memory bandwidth, but you're sacrificing a good bit of wattage for not a heck of a lot of performance.
  • khimera2000 - Wednesday, July 6, 2011 - link

    My issue with this article is the touting of Optimus, but the program isn't even supported that well. My notebook hasn't seen a driver upgrade in the last 6 months. AMD might not have the dual graphics out all over, but you can bet that it will be better supported once all the bugs are knocked out.

    As it stands, having Intel and NVIDIA play nice is really starting to chap my ass, and is becoming a fast reason to dump the Intel/NVIDIA headache and go for a pure AMD build (once the drivers are mature enough, of course).

    Intel-based Optimus is broken; I wouldn't outline the feature so much, it's misleading.

    agree with the rest though :)
  • RussianSensation - Wednesday, July 6, 2011 - link

    It would have been even more helpful if you guys included some benchmarks with the GPUs segregated into Mainstream and Performance categories. I have a feeling the 6970M is the "best bang for the buck" on the high-end for mobile GPUs. The fact that the 6970M also lives in the slim iMac chassis likely suggests that it also runs a lot cooler and is more energy efficient than the 570M/580M chips.

    I feel that the current stagnation at the 40nm process has pretty much leveled GPU progress in both the mobile and desktop space. I foresee a major performance increase, especially on the mobile side, 12 months from now when we begin to see 28nm GPUs enter the marketplace.
  • Imnotrichey - Wednesday, July 6, 2011 - link

    I have never owned a laptop, but I have always wondered: how will these things do if you have a home base set up at home (external monitor, external keyboard, mouse, etc.) for hardcore gaming but still want the portability of a laptop for work/school use?

    If plugged in, will these things be able to handle playing games on a 24-inch monitor at 1920x1200? I am guessing not the latest graphically intense games (Crysis 2, for example), but what about TF2, WoW, L4D, and slightly older games like those?

    How much would you need to spend to handle gaming on an external monitor of that size? Sorry if this is a noob question, but that's always been my goal with a laptop and I've never pulled the trigger. Might have to with grad school coming up soon.
  • randomusername3242 - Wednesday, July 6, 2011 - link

    For games such as WoW, TF2, L4D it is definitely possible. 1920 x 1080 at max settings is something a mid-tier mobile card could realistically do.

    For Crysis etc. you *can* make it work but it makes no sense. Like I posted above, you will overpay by $500-$1000 at least and the laptop will not even be portable in the end. It will be as portable as a concrete brick that weighs 10 lbs.
