If you're in the market for an Alienware M17x R3 or M18x (and if you're looking for a gaming notebook, you should really be considering the M17x R3), NVIDIA has some great news for you. The GeForce GTX 580M, which in our testing competes with the AMD Radeon HD 6990M for the title of fastest mobile GPU on the market (the two tend to trade blows), has received a drastic price cut over at Alienware that suddenly makes it much, much more competitive.

Where before it was hard to justify the upgrade due to the outrageous $300+ premium for the GTX 580M, Alienware is now selling it at just $75 more than the 6990M in the M17x R3. That pricing carries over to the M18x, where a pair of GTX 580Ms in SLI is $150 more than a pair of 6990Ms in CrossFire. The two solutions are still essentially neck and neck, but if you need CUDA support (e.g. for Adobe Premiere CS5.5), want to enjoy PhysX (in Batman: Arkham City), or would rather use Optimus than manual graphics switching, the 580M is now a much more reasonable option.

Unfortunately this seems to apply only to Alienware; boutiques offering Clevo-based systems still seem to be dealing with the old pricing, which places an unreasonable premium on the GTX 580M. So if you're shopping those, your best bet is still going to be the AMD Radeon HD 6990M.


  • RussianSensation - Monday, November 07, 2011 - link

    Well, to be fair, gaming laptops have never represented good value. So if you wanted the best value, you'd be gaming on the laptop. At the same time, I'm pretty sure a good gaming laptop today costs less than it did 5-7 years ago. Also, with modern Intel CPUs of the Nehalem/Lynnfield or Sandy Bridge generations, your laptop doesn't become obsolete as quickly.

    I understand that you are frustrated since generally speaking high-end laptop GPUs are only about as fast as a $150-200 mid-range desktop GPU. Still, this has pretty much been the case for both NV and AMD for years now.

    You can't have a company selling you parts for barely any profit. You purchased the GTX 560M laptop despite it being a worse value than, say, a GTX 560 Ti on the desktop, correct? NV is simply meeting market demand with its pricing. If no one bought their cards at these prices, they would be forced to lower them.
    Reply
  • RussianSensation - Monday, November 07, 2011 - link

    I meant if you wanted the "best value", you would have been gaming on a *desktop*. Reply
  • Sunsmasher - Monday, November 07, 2011 - link

    Less swearing please. It's not necessary in order to make your point here. Reply
  • Dustin Sklavos - Tuesday, November 08, 2011 - link

    I decided to go dig up some facts.

    First, the GTX 580M is basically a downclocked desktop GeForce GTX 560 Ti. Okay.

    The 560 Ti has a TDP of 170W, with a core clock of 822MHz and GDDR5 clock of 4GHz. The GTX 580M, by comparison, has a TDP of 100W, with a core clock of 620MHz and GDDR5 clock of 3GHz. That means that to get the GF114 to run at ~60% of the TDP of its desktop counterpart, it's running at 75% of the core and memory clocks, AND that's before you take into account that it also has twice the GDDR5. While I wouldn't expect linear scaling, it's reasonable to think there's some heavy-duty binning going on here. You're also minimizing the difference in cost; the 580M's upgrade price dropped by at LEAST $300.

    Okay, so what about the GTX 560M that you're kvetching about? Well, that's basically a desktop GeForce GTX 550 Ti. The 550 Ti has a TDP of 116W (and I'm keen to point out that the 100W GTX 580M thoroughly outclasses it), with a core clock of 900MHz and GDDR5 clock of 4.1GHz. Meanwhile, its mobile counterpart, the GTX 560M, operates with a core clock of 775MHz and GDDR5 clock of 2.5GHz. There's no published TDP spec for the 560M, but it's estimated to be about 75W. So for ~65% of the desktop part's TDP, you get 86% of the core clock and 60% of the GDDR5 clock. Once again, it's reasonable to think these are also being binned.

    The prices on notebook GPUs are always going to be silly compared to their desktop cousins. But the desktop chips don't have to fit into tight thermal envelopes the way their mobile counterparts do. Look at the math: it's a lot easier to make a desktop 550 Ti than it is to make a GTX 560M.
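    For anyone who wants to double-check the ratios quoted above, here's a quick arithmetic sketch. The spec figures are the ones cited in this comment; note the 560M's 75W TDP is an estimate, not a published number:

    ```python
    # Mobile-vs-desktop ratios for the two GPU pairs discussed above.
    # TDP in watts, core clock in MHz, effective memory clock in GHz.
    pairs = {
        "GTX 580M vs GTX 560 Ti": {"tdp": (100, 170), "core": (620, 822), "mem": (3.0, 4.0)},
        "GTX 560M vs GTX 550 Ti": {"tdp": (75, 116), "core": (775, 900), "mem": (2.5, 4.1)},
    }

    for name, specs in pairs.items():
        ratios = {k: mobile / desktop for k, (mobile, desktop) in specs.items()}
        print(name, {k: f"{v:.0%}" for k, v in ratios.items()})
    # 580M: ~59% of the TDP at 75% of the core and memory clocks
    # 560M: ~65% of the TDP at 86% core / ~61% memory clock
    ```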
    Reply
  • geniekid - Tuesday, November 08, 2011 - link

    Thanks for that :) Reply
  • PellyNVIDIA - Tuesday, November 08, 2011 - link

    Exactly! Thank you for posting that Dustin!

    A quick look at the desktop GeForce GTX 580 shows a card that is ~10.5" long, has a dual-slot heatsink assembly, requires a 700W+ PSU, and has two power connectors in addition to drawing power via PCIe, etc.

    In stark contrast, the notebook GeForce GTX 580M module is ~1/4 the size of the desktop PCB, features a low-profile heatsink, fits in a 100W power envelope, requires no additional power connectors, etc.

    As Dustin points out above, the characteristics necessary to hit the right clocks for the notebook parts are largely different from those used to create the desktop parts. You only get a certain number of these particular chips per wafer, so it all comes down to supply and demand.
    Reply
  • aguilpa1 - Tuesday, November 08, 2011 - link

    So why call it a GTX 580M at all, if we all agree it is nowhere near the specifications of a true GTX 580? Call it what it is: a 560M or a 550M. I would even be OK with 565M or 555M, but making it out to be a product that it obviously is not is the major issue here. It has nothing to do with binning and the laws of thermodynamics, whatever. I wouldn't give either NVIDIA or AMD a free ride on this. Their naming of products is misleading and deceitful, all in the name of charging more.

    Would you pay a lot more for a 200hp engine branded a "V8 Hemi" if it was put in a small compact car and only had six cylinders? What if you knew it was neither a V8 nor a Hemi, and the real one produced 350hp? (This is just an example, as I'm not sure of the actual horsepower of a true V8 Hemi.) Would you then argue that because you had to shove such a big and powerful motor into a small chassis, you were justified in charging several times more?
    Reply
  • PellyNVIDIA - Tuesday, November 08, 2011 - link

    NVIDIA uses the naming convention to convey performance within a family of GPUs.

    "GTX" signifies the enthusiast segment, and the numerical value indicates overall performance (i.e., the 580 is faster than the 570, etc.).

    For the 5xx family (both desktop and notebook), the GTX 580 brand designates the fastest GPU. Without question, the GeForce GTX 580M is the fastest notebook GPU we (or any other vendor) offer. With that in mind, the GPU deserves the GTX 580M designation to indicate that it offers the best possible performance for a notebook.
    Reply
  • nerrawg - Wednesday, November 09, 2011 - link

    Actually, both you (NVIDIA) and AMD deserve some flak for your naming schemes. While we enthusiasts always know the differences in performance (not to mention stream processor counts, shader clocks, etc.), the general customer does not, and there has been much "tomfoolery" going on in this regard. To mention a few examples:

    NVIDIA:

    The desktop GTS 250 was actually considerably faster than the GTS 450 (we know why, of course), but following your logic above, that card should have been two generations ahead (we know it was only one) and therefore expected to be significantly faster (just compare the GTX 280 to the current GTX 580).

    AMD:
    The 5850 was faster than the 6850, and the same goes for the 5870 and 6870 - confusing to those out of the "know".
    Several models have the same performance: 5750 vs. 6750, 5770 vs. 6770.
    NVIDIA does this as well, but with their mobile GPUs.

    Shame on you both for misleading all those sheep out there! You're the reason people are always asking us (enthusiasts) what to buy! (It gets annoying after a while.)
    Reply
  • therealnickdanger - Wednesday, November 09, 2011 - link

    A fool and his money...

    Yeah, I find the whole process deceptive. If the underlying hardware is really a 560, then it IS a 560, not a 580. ATI and NVIDIA are both irresponsible in this regard.

    The excuse that "it's a marketing decision" is essentially an excuse to lie to the customer, not to convey an accurate depiction of performance. The people who want "top of the line" performance will buy the most expensive GPU offered whether it says 580 or 560, so I believe it would be responsible to name it realistically.

    But hey, y'know, whatever.
    Reply
