Recommendations and Conclusion

So now that we have the nitty-gritty out of the way, how do we break things down? If you're looking strictly at pure performance, parts from either AMD or NVIDIA are going to be suitable for you (budget notwithstanding.) In the interest of fairness, we'll include Intel in the pros and cons conversation as well.

First, Intel has the best dedicated video encoding hardware on the market. AMD and NVIDIA both offer solutions that harness their shaders to substantially accelerate video encoding, but Intel's Quick Sync is best of breed, offering a healthy improvement in encoding speed while producing the best output short of doing the encoding on the CPU itself. It's worth noting, though, that NVIDIA solutions and AMD ones supporting switchable graphics can take advantage of Quick Sync as well, since the processor's integrated graphics remain active; you don't necessarily have to tie yourself down to Intel graphics to benefit from it.
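To make the Quick Sync workflow concrete, here's a minimal sketch of how you might hand an encode off to the hardware from a script. It assumes an ffmpeg build compiled with Quick Sync support (the `h264_qsv` encoder), which not every build includes; the filenames and bitrate are placeholders.

```python
# Sketch: build an ffmpeg command that offloads H.264 encoding to Intel
# Quick Sync via the h264_qsv encoder. Assumes an ffmpeg build with Quick
# Sync support -- verify with `ffmpeg -encoders | grep qsv` first.

def quicksync_encode_cmd(src, dst, bitrate="5M"):
    """Return the ffmpeg argument list for a Quick Sync H.264 encode."""
    return [
        "ffmpeg",
        "-i", src,           # input file
        "-c:v", "h264_qsv",  # hand video encoding to the Quick Sync block
        "-b:v", bitrate,     # target video bitrate
        "-c:a", "copy",      # pass audio through untouched
        dst,
    ]

if __name__ == "__main__":
    import subprocess
    cmd = quicksync_encode_cmd("input.mkv", "output.mp4")
    print(" ".join(cmd))
    # Uncomment on a machine with Quick Sync-capable ffmpeg:
    # subprocess.run(cmd, check=True)
```

On an Optimus or switchable-graphics notebook, the encode runs on the Intel integrated GPU even while the discrete GPU handles rendering, which is why the feature isn't limited to Intel-only configurations.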

If you take video encoding out of the equation, unfortunately AMD isn't quite as strong in terms of feature offerings, boiling down to arguably slightly better image quality and support for Eyefinity (provided the notebook has a DisplayPort.) They do have a hybrid graphics solution similar to Optimus, but availability is spotty and you'll have to research the notebook model you're looking at to see if its switchable graphics are supported. NVIDIA's Optimus, on the other hand, is pervasive and mature, and their mobile graphics drivers are more widely supported than AMD's. 3D Vision, CUDA, and PhysX are much more niche features; AMD also offers 3D support, which has materialized in a handful of 3D-ready notebooks. If you have a need for CUDA or a desire for PhysX, your graphics vendor has been decided for you.

Knowing what each vendor offers, now we just have to know what to look for.

The netbook or ultraportable gamer is pretty much stuck with either buying a netbook with AMD's E-350 processor or paying through the nose for an Alienware M11x (spoiler alert: heavier than most "netbooks.") That's not a horrible thing, as the E-350 has a capable graphics core, but even though the CPU side is faster than a dual-core Atom, it's still not quite enough to pick up the slack.

Gamers on an extreme budget used to be more or less screwed, but thankfully that's changed. Notebooks with AMD's A6 or A8 processors are going to be your one-stop shop, offering a tantalizing mix of middle-of-the-road CPU performance with remarkably fast integrated graphics hardware. There's a reason AMD refers to the A6 and A8 graphics hardware as "discrete-class" and for once it's not just marketing jargon. If you want to game for under $600, this is the way to go. In fact, it's even a little difficult to recommend spending up for a notebook with anything less than a GeForce GT 540M or Radeon HD 6500M/6600M/6700M unless you really need the faster CPU on top of it. If gaming while on the battery is important to you, then you need to be looking for Llano.

Users looking for a more well-rounded notebook would probably be well served by the aforementioned GeForce GT 540M or Radeon HD 6500M/6600M. These will hang out between about $700 and a grand, and notebooks using these chips are going to be fairly mainstream in form factor, so you won't be lugging a land monster around. Be forewarned, though: these GPUs are going to be inadequate for driving games at 1080p and may still struggle at 1600x900.

The serious gamer looking for an affordable machine should be gunning straight for notebooks with NVIDIA's GeForce GTX 560M. This, or AMD's Radeon HD 6800M, will be the bare minimum for gaming comfortably at 1080p, but honestly the GTX 560M is liable to be the sweet spot, offering the best balance of form factor and performance before you start getting into the huge, heavy desktop replacement notebooks.

Finally, for those for whom money is no object, just about anything from the Radeon HD 6900M series or the GTX 570M or 580M is going to do the trick, and for the truly excessive, an SLI or CrossFire notebook will pay dividends.

Update: Intel's engineers took umbrage at our suggestion that Intel's integrated graphics driver quality is still poor, and they were right to do so. While older graphics architectures may still be a bit fraught, Sandy Bridge is an overwhelming improvement. This guide has been updated to reflect that fact.


  • prdola0 - Thursday, July 7, 2011 - link

    OpenGL is "legacy" stuff? You must be living in a basement locked in a corner. Have you heard of OpenGL 4.1, which is equal to or even better than DirectX 11? No. You just troll around. Unlike other trolls, your posts are not even clever or funny. Go back to DailyTech.
  • Pirks - Thursday, July 7, 2011 - link

    If OpenGL was all that unicorny and shit butterflies like you imply, then game devs would use it instead of DX. Alas, looks like you have no clue.
  • bhassel - Thursday, July 7, 2011 - link

    Game devs do use it. Ever seen a PS3 game?

    And yeah, DX is a cleaner API from a developer point of view, but that says nothing about the quality of the graphics it produces (same as OpenGL.) If anything, as more devices move away from windows, *DX* will be the legacy stuff.
  • Pirks - Thursday, July 7, 2011 - link

    Yeah, the dead PS3 of the dead Sony, welcome to the dead PSN guys! Gee what an argument, such a shitty console. This only proves my point that only shit developers on shit consoles use legacy OpenGL. If you code for Sony you must be crazy fucked up masochistic pervert enjoying pain in the ass that Sony gives you. Just read any interviews with Sony hardware using devs. I used to develop for PS3 a few years ago and you really have to look around thoroughly to find as stupid, stinking, developer unfriendly and moronic set of tools as Sony puked out for its console. Anything shitty MS ever did looks totally angelic and unicorny compared to poop Sony feeds its devs. No wonder there are some crap games on PS3 and the best stuff is on DX console from MS. Devs are smart and they like the best dev tools and the best APIs and currently no one even comes close to MS in that regard. So Sony using morons can stuff legacy OpenGL in where it belongs, ya know what I'm talking about eh :)))
  • Broheim - Friday, August 5, 2011 - link

    I'm gonna call BS on you being a developer, your complete fucking ignorance about openGL would be completely inexcusable for someone who supposedly "developed for PS3"...
  • leexgx - Saturday, July 9, 2011 - link

    so many comments on here it's now stacking

    OpenGL is good in some ways, as most OpenGL games should work under Linux

    but most games made using OpenGL seem to feel the same
  • UMADBRO - Thursday, July 7, 2011 - link

    I think you're the one that is clueless
  • Pirks - Friday, July 8, 2011 - link

    Tell that to game devs who use DX everywhere instead of legacy OpenGL shit on a shoddy Sony console that is as dead as PSN itself
  • Etern205 - Saturday, July 9, 2011 - link

    Pirks kicking Apple to the curb? o_0
    Hell has frozen over and pigs do fly! :P
  • Broheim - Friday, August 5, 2011 - link

    Adobe haven't "dropped Mac support eons ago" because that's where they make their money.

    how do you propose Apple implements a proprietary Microsoft technology, which Microsoft has no intention of sharing, into their OS?
    also, if OpenGL is so inferior, why haven't Apple just written their own API like they did with OpenCL?

    the fact that Minecraft doesn't try to look good on purpose seems to elude your simpleton brain, but that hardly comes as a surprise; Minecraft is about freedom and gameplay, it's not just shovelware with purty textures.
