83 Comments

  • Iketh - Tuesday, July 05, 2011 - link

    GT555M "B" is an option in the Dell XPS line Reply
  • anotherfakeaccount - Wednesday, July 06, 2011 - link

    This is true ^^ Reply
  • anotherfakeaccount - Wednesday, July 06, 2011 - link

    The Dell XPS 17 ONLY btw Reply
  • zackyy - Wednesday, July 06, 2011 - link

    Only on the 17inch brick Reply
  • barmalej - Wednesday, July 06, 2011 - link

    There is a GT 555M "B" with a 128-bit bus (Clevo W150HR), and the GTX 570M also uses a 192-bit memory bus Reply
  • Dustin Sklavos - Wednesday, July 06, 2011 - link

    Ack, thank you, fixed it. Reply
  • marc1000 - Wednesday, July 06, 2011 - link

    ack?

    network terminology now? lol
    Reply
  • Meaker10 - Thursday, July 07, 2011 - link

    I don't see it fixed. Also, the 144-shader part with a 128-bit memory bus has 16 ROPs rather than 24, is far more common than the GDDR5 part, and has been around in Clevo and (more importantly) Acer machines (the largest notebook maker, after all) for far longer. Reply
  • Althernai - Wednesday, July 06, 2011 - link

    Just a word of warning about AMD GPUs in the latest Sandy Bridge laptops: AMD has moved from their manual GPU switching to a muxless, automatic switchable graphics scheme similar to Optimus, except that it doesn't work nearly as well. In particular, OpenGL applications (MineCraft, much of Adobe's content creation suite, etc.) will always run on the integrated GPU, regardless of what the user tries to force them to do.

    They tried to pull this trick without telling anyone and now there are a lot of angry people who got a laptop with a graphics card that refuses to work for their purposes:
    http://h30434.www3.hp.com/t5/Notebook-Display-and-...
    http://en.community.dell.com/support-forums/laptop...
    http://forum.lenovo.com/t5/ThinkPad-Edge/ATI-GPU-d...

    It's really a pity too because the combination of the 6770M and Sandy Bridge with switchable graphics is the best out there if you need a decent CPU, good battery life and a powerful GPU, but the latter only works for DirectX.
    Reply
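A practical note for anyone hit by the muxless switching problem described above: the quickest way to confirm which GPU an OpenGL app actually landed on is to check the GL_RENDERER string it reports (most GL info tools will show it). A minimal sketch of that check; the helper name and the string heuristic are mine, not from any of the linked threads:

```python
def runs_on_integrated(renderer: str) -> bool:
    """Crude heuristic: Intel integrated GPUs report 'Intel' in their
    GL_RENDERER string, while the discrete AMD parts report 'AMD'/'ATI'."""
    return "intel" in renderer.lower()

# The muxless scheme described above leaves OpenGL apps on the IGP:
print(runs_on_integrated("Intel(R) HD Graphics Family"))  # True  (stuck on the IGP)
print(runs_on_integrated("AMD Radeon HD 6770M"))          # False (discrete GPU active)
```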
  • Wolfpup - Wednesday, July 06, 2011 - link

    Besides that, they can't use normal drivers on Intel CPUs either.

    I *HATE* all this switchable graphics stuff. As though it weren't a minor miracle this stuff worked at all, we're going to add all sorts of complexity to it?!?
    Reply
  • Pirks - Wednesday, July 06, 2011 - link

    Some idiots from Adobe use ancient OpenGL shit instead of proper DX 10/11 APIs, who cares? Don't buy Adobe shit, buy shit that supports DX 10 or 11, that's the solution.

    And Minecraft is such an ugly POS I'm surprised it's not dead yet. Of course such a hideous ugliness would use OpenGL; why am I not surprised.
    Reply
  • Drizzt321 - Wednesday, July 06, 2011 - link

    Ok, so, full of hate. Minecraft is not MEANT to look like a 2011 super new high-res textured game with all the bells and whistles and features you get in the latest games. Part of its charm (for many) is its decidedly simple looks, simplistic-seeming gameplay, and the world building you can do.

    Uh...and Adobe probably uses OpenGL since they also run on Mac, and their apps are not intended to look or act like games do, but to accelerate things that can run more efficiently (and quickly) on the GPU.

    P.S. I know, I shouldn't feed the trolls, but the Minecraft comment really got me with its hatefulness.
    Reply
  • Pirks - Wednesday, July 06, 2011 - link

    Simple is one thing and downright freaking hideously ugly is another, you know

    If I were Adobe I'd have dropped Mac support eons ago; it's such a pain in the butt to deal with ancient ugly OpenGL just 'cause Apple is incapable of using something better like DX 11 or something

    Anyway, I won't touch any Minecraft, Adobe, OpenGL or any other shit like that with a 10 foot pole

    For simple-looking games I'd go for proper stuff like MDK, very simple looking but very, very far from the hideous ugliness of Minecraft's cubistic shit
    Reply
  • UMADBRO - Thursday, July 07, 2011 - link

    MDK a "proper" game?!?!?!?1?

    BWAHAHAHAHAHA

    ...

    wait, he's serious?

    ...

    BWAHAHAHAAAHHHHHAHAHA XD

    No seriously, just because you don't like it, doesn't mean all the vile shit you spewed about it is true. And honestly, the MC community will do just fine without the likes of you. LMAO!
    Reply
  • Pirks - Thursday, July 07, 2011 - link

    Yeah, compared to the hideous ugliness of Minecraft, MDK is the proper one, simplistic but at the same time doesn't look like your poop Reply
  • UMADBRO - Thursday, July 07, 2011 - link

    You're so full of it. I would say it's funny, but actually, it's rather sad and pathetic. You go play yurr 14 year old games and keep deluding yourself on what's "proper" or not... Reply
  • Pirks - Friday, July 08, 2011 - link

    Beauty has no age, and the same holds true for ugliness. Time will pass and MDK will stay simplistic and at the same time beautiful, while no amount of time will fix the cubistic cheap piece of shit look of Minecraft. I can't speak to its gameplay, maybe for some people it's interesting (to me it's as boring as The Sims and similar girly shit), but its graphics are worse than Digger and the original ping pong from 1970. I dunno what could look more shoddy than that among 3D stuff. Among 2D games there are more hideous titles for sure. Reply
  • Penti - Thursday, July 07, 2011 - link

    Why are you trolling? Every professional video post-production, 3D modeling/animation and imaging app will use OpenGL. Besides, how could they use something Microsoft hasn't made public, standardized and licensed?

    OpenGL has nothing to do with the look of a game; you use the same tools and typically the same game engine regardless of rendering pipeline and API. The tools where you are crafting the 3D models are fully dependent on OpenGL anyway. Not that it matters. And of course any mobile game is GLES.
    Reply
  • Penti - Thursday, July 07, 2011 - link

    Also, you can easily convert shaders between HLSL and GLSL, and you can also use NVIDIA's Cg which compiles to either (and also works on some consoles). No really big problem there; all the stuff you need is supported in both APIs.

    For simple games you could just go with something like Unity. For that matter, consoles are pretty much limited to 2004-era D3D9 graphics. The newest game on OpenGL (on Windows) is probably Brink, which doesn't look too bad. Performance is still there, so it works out. It's here to stay.
    Reply
  • Pirks - Thursday, July 07, 2011 - link

    I just want Windows software to use proper APIs for Windows, not some legacy Unix shit, that's all Reply
  • prdola0 - Thursday, July 07, 2011 - link

    OpenGL is "legacy" stuff? You must be living in a basement, locked in a corner. Have you heard of OpenGL 4.1, which is equal to or even better than DirectX 11? No. You just troll around. Unlike other trolls, your posts are not even clever or funny. Go back to DailyTech. Reply
  • Pirks - Thursday, July 07, 2011 - link

    If OpenGL were all that unicorny and shit butterflies like you imply, then game devs would use it instead of DX. Alas, looks like you have no clue. Reply
  • bhassel - Thursday, July 07, 2011 - link

    Game devs do use it. Ever seen a PS3 game?

    And yeah, DX is a cleaner API from a developer point of view, but that says nothing about the quality of the graphics it produces (same as OpenGL). If anything, as more devices move away from Windows, *DX* will be the legacy stuff.
    Reply
  • Pirks - Thursday, July 07, 2011 - link

    Yeah, the dead PS3 of the dead Sony, welcome to the dead PSN guys! Gee what an argument, such a shitty console. This only proves my point that only shit developers on shit consoles use legacy OpenGL. If you code for Sony you must be crazy fucked up masochistic pervert enjoying pain in the ass that Sony gives you. Just read any interviews with Sony hardware using devs. I used to develop for PS3 a few years ago and you really have to look around thoroughly to find as stupid, stinking, developer unfriendly and moronic set of tools as Sony puked out for its console. Anything shitty MS ever did looks totally angelic and unicorny compared to poop Sony feeds its devs. No wonder there are some crap games on PS3 and the best stuff is on DX console from MS. Devs are smart and they like the best dev tools and the best APIs and currently no one even comes close to MS in that regard. So Sony using morons can stuff legacy OpenGL in where it belongs, ya know what I'm talking about eh :))) Reply
  • Broheim - Friday, August 05, 2011 - link

    I'm gonna call BS on you being a developer, your complete fucking ignorance about openGL would be completely inexcusable for someone who supposedly "developed for PS3"... Reply
  • leexgx - Saturday, July 09, 2011 - link

    so many comments on here, it's now stacking

    OpenGL is good in some ways, as most OpenGL games should work under Linux

    but most games that use OpenGL feel like any other game that's made using OpenGL (the feel of the games seems the same)
    Reply
  • UMADBRO - Thursday, July 07, 2011 - link

    I think you're the one that is clueless Reply
  • Pirks - Friday, July 08, 2011 - link

    Tell that to game devs who use DX everywhere instead of legacy OpenGL shit on a shoddy Sony console that is as dead as PSN itself Reply
  • Etern205 - Saturday, July 09, 2011 - link

    Pirks kicking Apple to the curb? o_0
    Hell has frozen and pigs do fly! :P
    Reply
  • Broheim - Friday, August 05, 2011 - link

    Adobe hasn't "dropped Mac support eons ago" because that's where they make their money.

    How do you propose Apple implements a proprietary Microsoft technology, one Microsoft has no intention of sharing, into their OS?
    Also, if OpenGL is so inferior, why hasn't Apple just written their own API like they did with OpenCL?

    The fact that Minecraft doesn't try to look good on purpose seems to elude your simpleton brain, but that hardly comes as a surprise; Minecraft is about the freedom and gameplay, it's not just shovelware with purty textures.
    Reply
  • samspqr - Wednesday, July 06, 2011 - link

    "First, Intel has the best dedicated video encoding hardware on the market"

    no, it doesn't; it has the fastest one, but quality is very poor, so nobody should use it except for delivering content to a smartphone for watching while commuting

    until they solve those quality issues, it is a worthless feature

    (or maybe they already solved them and I didn't get the news: please report if that is so)
    Reply
  • Dustin Sklavos - Wednesday, July 06, 2011 - link

    In every comparison I've read, Quick Sync produces the best encoded quality short of raw CPU-only encoding. It's my understanding that in order of quality, Intel's encoder is best, followed by AMD's and then NVIDIA's in dead last. Intel's just also happens to be the fastest.

    That said, I work in video on the weekends and as a matter of principle won't do anything but CPU-based encodes, so I could really care either way. ;)
    Reply
  • mino - Thursday, July 07, 2011 - link

    1) CPU-based encodes are _indifferent_ to the CPU used, even to the CPU _architecture_ used.

    2) That Intel has the best _accelerated_ encode _for_specific_use_cases_ is FAR from "has the best encode".

    Please do some research on terminology next time.
    Reply
  • Darkstone - Wednesday, July 06, 2011 - link

    There is yet another revision of the GT 550M. It is identical to the GT 555M "B" except it has a 128-bit memory bus. Confused yet?
    http://forum.notebookreview.com/dell-xps-studio-xp...
    It is only found in the XPS 17 3D.
    Reply
  • Darkstone - Wednesday, July 06, 2011 - link

    Remove that "3D" part Reply
  • Darkstone - Wednesday, July 06, 2011 - link

    There is more: a GT 555M with 144 shaders, GDDR3, and a 128-bit bus:
    http://i56.tinypic.com/2nh0tnd.jpg
    A higher-clocked version of the above GT 550M... Screenshot from a Clevo 15" notebook (BTO 15CL35)
    Reply
  • Dustin Sklavos - Wednesday, July 06, 2011 - link

    Ugh, seriously?

    This is insane. I'm impressed. AMD rebadges an entire generation yet NVIDIA somehow still manages to win the "Idiotic Branding" award.
    Reply
  • silverblue - Wednesday, July 06, 2011 - link

    If at first you don't succeed... attempt to confuse your consumers as much as humanly possible. ;) Reply
  • kasakka - Wednesday, July 06, 2011 - link

    Whoever is making up these brand names needs to be shot. They're unnecessarily confusing and the whole rebranding old crap thing really needs to stop.

    It would be nice to see some sort of comparison of the current mobile chips to the older ones like the 9400M. I haven't bought a new laptop for several years and have no idea how fast the current chips truly are compared to the older stuff. Probably hard to do because you can't just get the same configuration with different GPUs.
    Reply
  • Wieland - Wednesday, July 06, 2011 - link

    Have you seen this?

    http://www.notebookcheck.net/Mobile-Graphics-Cards...

    And, for good measure, this:

    http://www.notebookcheck.net/Mobile-Processors-Ben...
    Reply
  • anotherfakeaccount - Wednesday, July 06, 2011 - link

    If anyone is buying a laptop, the best deal you can get is the HP dv6t or dv7t: 6770M, 2630QM processor, matte 1080p screen. You can't beat it and it's under $1000 or barely over. Yes, there is a graphics switching problem, but it should not affect a typical gamer.

    The Dell XPS 17 is comparable but costs more. Other good choices are the ASUS G53/G73 and MSI Force 16F2, for those with bigger budgets who do not care if their laptop looks ugly and is bulky.
    Reply
  • anotherfakeaccount - Wednesday, July 06, 2011 - link

    "This, or AMD's Radeon HD 6800M, will be the bare minimum for gaming comfortably at 1080p, but honestly the GTX 560M is liable to be the sweet spot in offering the very best balance in form factor favoring performance before you start getting into the huge, heavy desktop replacement notebooks."

    The GTX 560m can hardly be called portable. A 6850m can be put in a laptop with comparable size. And neither laptop is truly portable.
    Reply
  • Stuka87 - Wednesday, July 06, 2011 - link

    I don't see any mention of the Quadro series of chips? I realize they are somewhat a duplicate of consumer series chips, but they are probably worth a mention. Reply
  • DanNeely - Wednesday, July 06, 2011 - link

    Adding another level of WTF to what's already in the article would cause the servers to explode. Reply
  • Drizzt321 - Wednesday, July 06, 2011 - link

    Heh, yea, I was just asking about that. I have a Lenovo w520 on the way with the 1000m. Reply
  • Arbie - Wednesday, July 06, 2011 - link

    I think you hit the target - pulling together a lot of hard-to-find info and boiling down the choices. This is exactly what I need to even get started on choosing a game-capable laptop / netbook. Thanks. Reply
  • MrTeal - Wednesday, July 06, 2011 - link

    I know that you can't buy these chips yourself, and that OEMs might be able to work out better deals than the list price, but it would be interesting to know what each GPU is listed at in 1000 unit quantities, just to get an idea of the relative cost between them. Reply
  • scook9 - Wednesday, July 06, 2011 - link

    Price is EXTREMELY relevant here. Something that cannot be ignored. Reason being that NVIDIA prices are out-of-this-world high compared to ATI, and that forces my hand rather often

    I am painfully knowledgeable about notebook hardware (over 10k posts on the notebookreview forums under the same username) so I like to think I have some credibility

    When wondering why price matters....just look at the pricing on graphics options for the Alienware M18x (bear in mind these are prices for 2 cards, not 1, but it shows the differences)
    -Upgrade from stock to CF 6970m $400
    -Upgrade from stock to SLI GTX 580m $1200

    That is WAY too big of a difference for the spread in performance (5-10% real world?). I know that I have the CF 6970m's (GTX 580m's were not available when I ordered mine, so it was a very easy choice) with a 2920xm and that laptop screams. And for the gaming laptop haters out there....I get 4.5 hours battery life on the HD 3000 :D
    Reply
  • randomusername3242 - Wednesday, July 06, 2011 - link

    So you're complaining about prices for upgrades when you bought a 2920xm, which you probably paid an exorbitantly high price for? I wouldn't be surprised if you paid over $400 to upgrade from a 2630qm for that.

    I think it's idiotic to buy any high end mobile part, GTX 580m or 2920xm.

    There's a sweet spot in price/performance. It's the 2630qm + GTX 460m (maybe the 2720qm + 560m). Go any higher and you're throwing money away; go any lower and you don't get enough performance.

    And I'll bite. I think it's also dumb to buy a gaming laptop because even if you get 4.5 hours battery life, with the specs that you say you have, your laptop is not portable at all. Sure, you might not have a tower and many wires, but you're overpaying for a big and often ugly piece of metal that will not move around. (You really think you can move around 10 lbs?)

    And how much did you pay? You don't get a 2920xm + crossfire 6970ms for less than $2000.

    I'll make a distinction between a gaming laptop and a desktop replacement. Gaming laptops are feasible, sometimes affordable, and moderately portable. Desktop replacements are not portable, not affordable, and considerably inferior to a desktop.
    Reply
  • seapeople - Wednesday, July 06, 2011 - link

    Wow, you sound somewhat disillusioned. There are millions of people out there spending significantly more money on things they don't need that don't even give them performance benefits (such as a city-slicker buying an F150 or Cadillac SUV, or Joe Schmoe spending $3000/yr just so he can get his daily Starbucks coffee).

    In fact, if you are the type of person who can afford such luxury items, spending an extra $500 so your processor can turbo 20% higher and not slow you down wouldn't even register on your radar as being excessive, and rightfully so.

    Finally, you and so many others are completely wrong on the portability of big laptops. I like to watch movies or tv shows while, say, cooking dinner. Picking up a 10 pound laptop and bringing it to the kitchen with me is not even difficult in the slightest, whereas even the smallest portable desktop would require a 10 minute shutdown, transfer, and setup time.
    Reply
  • ppeterka - Thursday, July 07, 2011 - link

    For most people, portability is more than the distance from your couch to your kitchen. Try lugging that 10 pound beast with you on the underground, and try to fix up some slides in PowerPoint, or try to fit it into your hand luggage when flying to a meeting.

    It might be new to you, and I risk ruining your optimistic world, but laptops are work equipment too. For quite some people... And as gaming notebooks are over-over-overpriced and then some, I find them useless unless someone is a traveling game hero... And there is a price in that case: not only the price tag, but several other crippling compromises that must be made when going that route.

    For the price, you could get a decent Brazos based netbook to lug around, AND a fully fledged SLI/CF desktop. You're much better off with this, as I assume
    * you don't play interactive, 3d intensive games while cooking (which however Brazos would even support to a degree)
    * you won't plan on getting your gaming fix while underway

    Do you disagree with this?
    Reply
  • rubbahbandman - Friday, July 08, 2011 - link

    I think you'd be surprised how affordable a good "gaming" laptop/desktop replacement is. I picked up the HP dv7-6143cl from Costco for only $875 along with a 2 yr warranty and it has some solid specs.

    2630qm, 8gb ram, 6770m, and you'd think with a 17.3" screen it would be heavy, but it weighs only 7lbs, less than a gallon of milk and that's in spite of the ridiculous 9 cell battery it has. (supposedly it can manage a 9.5 hr battery life).

    The native res is 1600x900 which isn't that special, but it works great for demanding games like Crysis 2. With the dx11 patch and high-res textures pack I can manage a solid 45-60fps, which is perfectly playable, and that pretty much sets the bar for challenging my system, so other than Crysis 2, I can crank up the res to my heart's content.
    Reply
  • Mediamon - Sunday, July 10, 2011 - link

    Almost had me sold. Costco's HP configurator shows $1200 as the current price for a rig with those same specs. A $325 difference. You must know someone at Costco or be playing a joke. Reply
  • chinedooo - Monday, July 11, 2011 - link

    I would suggest HP's dv6t-6100 straight from their website: $1025 with tax for an i7 2630, 15.6in 1080p screen, 9 cell battery and HD 6770. This is after getting $450 off with a coupon which is pretty much always available. The thing weighs like 5.5 lbs. It really is a great laptop. Reply
  • scook9 - Thursday, July 07, 2011 - link

    I got the 2920xm because it can overclock (at all) and has considerably better turbo options.

    We know you think it is idiotic; most people do. Because they either a) can't appreciate mobility AND power or b) can't afford it

    I never tried to argue that my M18x was a top value proposition ;) simply that bang for the buck is not there for the GTX 580m's vs the 6970m's

    A 12 pound laptop is about as powerful as a 50 pound desktop. Additionally, it already has the UPS, screen, mouse, keyboard built in (adding to value mind you). If you cannot handle moving a 12 pound laptop, you are just pathetic. End of story. The thing does not have to be a damned frisbee, but it is plenty portable. I have traveled all over the country with high end (large) laptops, it is perfectly doable.

    And as for your remark about being inferior to a desktop, I can share some benches if you still feel that way.

    Here is one, it can play Crysis MAXED out with all settings very high and max AA at 60 FPS on the native resolution. Don't spout off shit you have zero experience with, makes you look like the child you are.

    SO, at the end of your rant the only real complaint I can come up with is price - yes, I could have spent that $4000 on a desktop but I did not want to. Because I like being able to take my entire system with me wherever I go without having to think twice about it. My desktop is about 65 pounds by itself - THAT is not portable; a laptop (even if it weighs 20 pounds) is always portable.
    Reply
  • jensend - Wednesday, July 06, 2011 - link

    You say "it's even a little difficult to recommend spending up for a notebook with anything less than a GeForce GT 540M or Radeon HD 6500M/6600M/6700M unless you really need the faster CPU on top of it." - but considering the pricing, the power consumption disadvantage, and Llano's strong performance I don't see why you'd go with a discrete AMD chip less powerful than Turks+gddr5. Why would you go for a (equal shader count) 6500M? Sure, there's more memory bandwidth, but you're sacrificing a good bit of wattage for not a heck of a lot of performance. Reply
  • khimera2000 - Wednesday, July 06, 2011 - link

    My issue with this article is the touting of Optimus, but the program isn't even supported that well. My notebook hasn't seen a driver upgrade in the last 6 months. AMD might not have dual graphics out all over, but you can bet it will be better supported once all the bugs are knocked out.

    As it stands, having Intel and NVIDIA play nice is really starting to chap my ass, and is fast becoming a reason to dump the Intel/NVIDIA headache and go for a pure AMD build (once the drivers are mature enough, of course).

    Intel-based Optimus is broken; I wouldn't outline the feature so much, it's misleading.

    agree with the rest though :)
    Reply
  • RussianSensation - Wednesday, July 06, 2011 - link

    It would have been even more helpful if you guys included some benchmarks with the GPUs segregated into Mainstream and Performance categories. I have a feeling the 6970M is the "best bang for the buck" on the high-end for mobile GPUs. The fact that the 6970M also lives in the slim iMac chassis likely suggests that it also runs a lot cooler and is more energy efficient/has better power consumption than the 570M/580M chips.

    I feel that current stagnation at 40nm process has pretty much leveled GPU progress in both the mobile and desktop space. I foresee a major performance increase, especially on the mobile side in 12 months from now when we begin to see 28nm GPUs enter the marketplace.
    Reply
  • Imnotrichey - Wednesday, July 06, 2011 - link

    I have never owned a laptop, but I have always wondered how will these things do if you have a home base set up at home (external monitor, external keyboard, mouse, etc.) for hardcore gaming but still want the portability of a laptop for work/school use.

    If plugged in, will these things be able to handle playing games on a 24 inch at 1920x1200? I am guessing not the latest graphically intense games (Crysis 2 for example), but what about TF2, WoW, L4D and slightly older games like those?

    How much would you need to spend to handle gaming on an external monitor of that size? Sorry if this is a noob question, but thats always been my goal with a laptop but have never pulled the trigger. Might have to with grad school coming up soon.
    Reply
  • randomusername3242 - Wednesday, July 06, 2011 - link

    For games such as WoW, TF2, L4D it is definitely possible. 1920 x 1080 at max settings is something a mid-tier mobile card could realistically do.

    For Crysis etc. you *can* make it work but it makes no sense. Like I posted above, you will overpay by $500-$1000 at least and the laptop will not even be portable in the end. It will be as portable as a concrete brick that weighs 10 lbs.
    Reply
  • Imnotrichey - Wednesday, July 06, 2011 - link

    Thanks. Glad to know I can still play many of those steam games when at home without losing the mobility needed for work and school. Reply
  • seapeople - Wednesday, July 06, 2011 - link

    In comparison desktops are as portable as a concrete brick that weighs 20 pounds, a non-folding chair, and 20 feet of rope that you must tie around the chair and brick and then connect the other end to a tree every time you want to use it.

    I think I'll take my 10 pound brick, thank you very much.
    Reply
  • UMADBRO - Thursday, July 07, 2011 - link

    You seem awful full of yourself. Not everyone shares the same perspective as you, so stop acting like your "opinion" is right and everyone else is wrong.

    And if 10 pounds is too damn heavy for you to move around, you have more serious problems to worry about than how much someone "should" spend on their system....
    Reply
  • Iketh - Thursday, July 07, 2011 - link

    what's with you and 10 lbs??? that isn't shit you pathetic weakling Reply
  • fb39ca4 - Wednesday, July 06, 2011 - link

    All of these chips (except for the gma in Atom netbooks) are faster than the quadro nvs 120m in mah laptop. ugh. Reply
  • Seikent - Wednesday, July 06, 2011 - link

    A chart comparing the graphics cards in some games would have been great. I know it is not so easy, but it is relevant. Reply
  • Drizzt321 - Wednesday, July 06, 2011 - link

    Where does the Quadro 1000M fit in here? I just bought the Lenovo w520 (still waiting for it to ship, argh!). I wasn't specifically looking for Quadro, but it had the rest of what I wanted (15", 1080p, Optimus, wide(r) gamut LCD, etc. Even a built-in color calibration sensor!) and it'd be interesting to see where it fits in here, given its 96 CUDA cores. Reply
  • Belard - Wednesday, July 06, 2011 - link

    Hey, let us know how that W520 works... the screen quality.

    Mobile graphics chips are not as powerful as their desktop counterparts; there are ways to make them run a bit cooler, such as lower clock rates and other factors.
    Reply
  • Drizzt321 - Thursday, July 07, 2011 - link

    Yea, I know, but since this was a, ya know, mobile graphics guide, I figured I'd ask since I'm not quite sure where it fits in performance-wise. But I didn't buy it for gaming, although I will probably do some, so it's not as big a deal to me.

    Well, supposedly it's pretty nice. From what I've read, pretty good contrast ratio and good black/white levels. Plus 95% NTSC, at least according to the specs, so I'm hopeful. Still a TN panel from what I understand, but at least it seems like a decent one. I'll try and remember to get back to the Anand forums on how it is. Hell, if Anand wants to give me some info on how he makes all those wonderful graphs (I also have an X-Rite eyeOne USB colorimeter), I'd be more than happy to put together that info.
    Reply
  • DaveSimmons - Wednesday, July 06, 2011 - link

    It always annoys me seeing a laptop advertised with a "560m" or "6850 mobile". I have to stop and remind myself that no, these offer nothing close to the performance of the desktop cards with those model numbers that I've read about in reviews. It's just NVIDIA and AMD trying to play me for a sucker by selling me a mislabeled 550 or 5770. Reply
  • P_Turner - Wednesday, July 06, 2011 - link

    I feel the same way about Intel's mobile Sandy Bridge CPUs. I expect an i5 to have four physical cores without hyperthreading, but an i5-2410m has only two physical cores with hyperthreading.

    It turns out that the majority of mobile i5 parts have only two cores. Worse, even some of the mobile i7 CPUs are still just two cores.
    Reply
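The spec-sheet confusion above is easy to check against Intel's published core counts. A small sketch with figures for a few 2011 mobile parts (core/thread counts per Intel's specs; the dict name is mine):

```python
# (cores, threads) for a few 2011 mobile Sandy Bridge CPUs, per Intel's specs
mobile_sandy_bridge = {
    "i5-2410M":  (2, 4),  # dual-core with Hyper-Threading
    "i7-2620M":  (2, 4),  # still dual-core, despite the i7 badge
    "i7-2630QM": (4, 8),  # the 'QM' suffix marks the true quad-cores
}

cores, threads = mobile_sandy_bridge["i7-2620M"]
print(cores, threads)  # 2 4
```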
  • Belard - Wednesday, July 06, 2011 - link

    As a general guide, this is helpful and yeah, both AMD and Nvidia add their confusion.

    But for many people, basic graphics are fine for a notebook... but then again, it's nice to have a bit MORE if you want to do some gaming.

    An excellent site to keep track of notebook GPUs (and CPUs), with amazing detail (a bit of overkill), that includes pretty much EVERY GPU that ever existed (pre-GeForce2 Go): go to notebookcheck.com

    Their chart lists are here (GPUs on the right; the first 3 items are most important) http://www.notebookcheck.net/FAQ-Tips-Technics.123...

    This is my favorite section: http://www.notebookcheck.net/Mobile-Graphics-Cards...

    Check out their notebook reviews, very very detailed - much like an AnandTech 10-20 page review of a new technology item, power supply or video card. Warning though, the site tends to default to German, but it comes in 10 languages - click the UK flag to read. :P
    Reply
  • kasakka - Thursday, July 07, 2011 - link

    Thanks for the info. Damn, my Macbook Pro's 9400M is pretty awful nowadays. Still enough for most things tho. Reply
  • Pirks - Thursday, July 07, 2011 - link

    My wife's 4 year old Dell Vostro is still too fast for her needs, and I paid like $1000 for it new. Judging by this there's gotta be about 4-6 more years of life left in it. Dell FTW! Reply
  • burntham77 - Wednesday, July 06, 2011 - link

    I have noticed that the only way to get a notebook with high-end AMD graphics is to get one paired with an Intel CPU. I would love to be able to buy a notebook with a high-end mobile AMD CPU and a high-end mobile AMD GPU. Reply
  • DanNeely - Thursday, July 07, 2011 - link

    AMD's mobile CPUs lag Intel's badly in performance, which is why they only show up in budget laptops. I don't know if they're enough slower to actually bottleneck the GPUs, but the fraction of gamers willing to take that large a CPU nerf to save $100-ish on a $1500-2500 laptop isn't large enough to justify making the product. Reply
  • Roland00Address - Wednesday, July 06, 2011 - link

    It should look like this.

    160/240/320/400 (6380G/6480G/6520G/6620G) Shaders, 12/16/20 (6480G/6520G/6620G) TMUs, 4/8 (6520G/6480G and 6620G) ROPs, Core Clock: 400-444MHz

    You originally had the TMUs and the ROPs backwards. The 6620G has more TMUs and more ROPs than the 6480G and the 6520G
    Reply
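The corrected line is easier to sanity-check laid out per part. A sketch of the same figures as a lookup table, reading "4/8" as 4 ROPs for the 6480G/6520G and 8 for the 6620G; the 6380G's TMU count isn't given in the comment, so it's left out:

```python
# Llano IGP variants as corrected above: shaders / TMUs / ROPs
llano_igps = {
    "6380G": {"shaders": 160},
    "6480G": {"shaders": 240, "tmus": 12, "rops": 4},
    "6520G": {"shaders": 320, "tmus": 16, "rops": 4},
    "6620G": {"shaders": 400, "tmus": 20, "rops": 8},
}

# The 6620G tops the line on every count, matching the correction:
print(llano_igps["6620G"])
```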
  • velanapontinha - Thursday, July 07, 2011 - link

    "20/16/12 (6480G/6520G/6620G) TMUs" - don't you mean "12/16/20" instead of "20/16/12"?

    Cheers,
    Fernando
    Reply
  • Meaker10 - Thursday, July 07, 2011 - link

    GT555M "C"
    144 CUDA cores, 24 TMUs, 16 ROPs, core clock 450MHz, shader clock 900MHz, 128-bit memory bus, DDR3, effective memory clock 1600MHz

    As used by Clevo and Acer, and called a 550M when used by Dell.
    Reply
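What all these bus-width permutations actually change is peak memory bandwidth, and that's simple arithmetic: bus width in bytes times the effective memory clock. A sketch using the "C" variant's quoted figures; the 192-bit number is only for comparison at the same clock:

```python
def bandwidth_gb_s(bus_width_bits: int, effective_clock_mhz: int) -> float:
    """Peak memory bandwidth: (bus width in bytes) * effective transfer rate."""
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

# GT 555M "C": 128-bit DDR3 at an effective 1600MHz
print(bandwidth_gb_s(128, 1600))  # 25.6 (GB/s)
# A 192-bit bus at the same effective clock, for comparison
print(bandwidth_gb_s(192, 1600))  # 38.4 (GB/s)
```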
  • Bolas - Thursday, July 07, 2011 - link

    Nvidia states that the GTX 580m supports 3D vision.

    Really? On which laptop? I have not been able to find a laptop that supports 3D Vision and uses this card. Someone help me out here and point me to a laptop with GTX 580m and 3D Vision 120Hz screen please.
    Reply
  • sidneyleejohnson - Sunday, July 10, 2011 - link

    In that regard:
    1) External: all laptops with a GTX 580M would support 3D Vision on an external monitor, say your Samsung 2011 3D TV, beamed via the SiBeam WirelessHD internal transmitter (M18x).
    2) But to meet your needs, Clevo and Alienware have the answer: the P170HM3 (Eurocom Neptune, etc.) or the M17x.

    As for dual 580M SLI, until the M18x offers a 120Hz option we'll have to wait for the SLI-equipped P270WN (ETA ~Nov; this is a "desktop CPU" laptop: hex-core, with a new motherboard/chipset)
    Reply
  • JNo - Friday, July 08, 2011 - link

    I too would like to say that this subject matter is very important; it's great to see an article covering what is a very nebulous area. I'll always prefer the sheer grunt & value of a desktop, but we can't ignore that laptops are now outselling them. And searching for notebooks which can play some games and have battery life is of great interest to me (and many others).

    Obviously, unlike desktops, you can't do true apples-to-apples comparisons with laptop gfx, but nevertheless it would be good to get some performance comparison graphs into the article, even though they would have to come with the caveat that they are only loosely comparable.

    Btw, I don't know which GT 555M it is, or even if it's a third type, but this unbelievably beautiful 13" i7 NVIDIA GT 555M LG laptop (yes, I know, LG!) has what's designated an N12P variant of the 555M:

    http://www.engadget.com/2011/06/10/new-lg-p330-lap...
    Reply
  • mason.s - Friday, July 08, 2011 - link

    I was lucky enough to get a gem of a gaming laptop in the Acer Aspire AS5552G763. I snagged it while it was on a super deep discount, at $589. After gaming with it for a couple of months, I can say that the Radeon 6650M is a very capable GPU at the laptop's native res (an admittedly anemic 1366x768). I've gotta say I was truly blessed to pick up this machine at the price. You can check it out at the link I used to buy it. It's still at the lower-middle end of the price range for this kind of laptop at $790:

    http://www.buy.com/prod/acer-aspire-as5552g7632-15...
    Reply
