20 Comments

  • Glibous - Monday, November 07, 2011 - link

    Sounds like Nvidia is really becoming competitive in the mobile space on price, the one major area where AMD had an advantage. I wonder if Optimus works with SLI yet.
  • JarredWalton - Monday, November 07, 2011 - link

    No Optimus for SLI; I'm not sure if NVIDIA is even pursuing that at all, as SLI in notebooks is a pretty small market and results in very heavy systems. We'll see, I suppose.
  • GoodBytes - Monday, November 07, 2011 - link

    Agree with JarredWalton.
    Also, people with SLI seek maximum gaming performance, and Optimus kills the system bus access (it puts everything the discrete GPU renders into the Intel GPU's section of memory), which means a nice reduction in CPU performance.

    I think Optimus is great for mid- to low-range GPUs.
  • PellyNVIDIA - Monday, November 07, 2011 - link

    Actually, Optimus does not kill the system bus access. In fact, you can benchmark a GeForce GTX 580M (flagship GPU) with and without Optimus and you'll find no difference! Optimus is an ideal technology for entry-level GPUs all the way up to the fastest GPU on the planet. With the GeForce GTX 580M and Optimus, systems like the Alienware M17x R3 are able to get over 5hrs of battery life (and offer absolutely phenomenal performance).

    For a full description of how Optimus works, you can read the Optimus whitepaper I wrote back when we launched the technology.

    www.nvidia.com/object/LO_optimus_whitepapers.html

    With thanks,

    Sean Pelletier
    Senior Technical Marketing Manager - Notebooks
    NVIDIA
  • Dustin Sklavos - Tuesday, November 08, 2011 - link

    I can actually corroborate Pelly's statement. Anecdotally I've found no perceptible performance difference between a GTX 580M with Optimus and without it.
  • tviceman - Monday, November 07, 2011 - link

    It's been rumored over the past few months that Nvidia was die-shrinking its Fermi mobile offerings, and a large price cut might signal that they are now trying to clear existing inventory for upcoming parts. I know this is just a theory based on a rumor, but if they are coming out with new mobile parts soon, then a price cut on current stock is indicative of that.
  • Hrel - Monday, November 07, 2011 - link

    Charging $475 instead of $675 for a $200 GPU isn't a price cut. That's like saying you shot someone 75 times instead of NUKING them and somehow they're less dead now. No, FUCK YOU Nvidia, drop your fucking profit margins so GPUs in laptops are reasonable. The GTX 560M (which I have, btw) performs like a $90 graphics card. Now, I can understand SOME price premium for laptop parts (even though there's less to them: no PCB or fan or massive heatsink, and even though nowadays they're higher volume), but seriously, the GTX 560M should cost AT THE MOST, THE MOST I SAY, $150. Yet everywhere I see it as an add-on it's much, much more than that. I haven't even seen a $150 add-on option, meaning the profit is already there, PLUS more than $150.

    I'm just getting exhausted with the price gouging bullshit. I make good money, but I also work in the industry. I know what this stuff actually costs: marketing, R&D, parts, labor, all of it. The margins on this stuff are criminal in almost all cases. It's like the RAM price gouging of the early 2000s. Fucking criminal.
  • chinedooo - Monday, November 07, 2011 - link

    I think it's because of supply and demand that the prices are so high.
  • JarredWalton - Monday, November 07, 2011 - link

    These are likely heavily binned GPUs, which means they are not as readily available. Selecting CPUs/GPUs for lower voltages and power characteristics and charging more for the best chips is no different from selecting chips for maximum clock speeds (e.g. "Extreme" CPUs). If you don't like the pricing, of course, the best way to vote is to not buy the product.
  • MrSpadge - Tuesday, November 08, 2011 - link

    Exactly. Making a 250 W chip sip only 50 - 75 W without crippling it is not exactly easy.
  • RussianSensation - Monday, November 07, 2011 - link

    Well, to be fair, gaming laptops have never represented good value. So if you wanted the best value, you'd be gaming on the laptop. At the same time, I am pretty sure a good gaming laptop today costs less than it did 5 or 7 years ago. Also, with modern Intel CPUs based on the Nehalem/Lynnfield or Sandy Bridge generations, your laptop doesn't become obsolete as fast.

    I understand that you are frustrated since, generally speaking, high-end laptop GPUs are only about as fast as a $150-200 mid-range desktop GPU. Still, this has pretty much been the case for both NV and AMD for years now.

    You can't have a company selling you parts for barely any profit. You purchased the GTX 560M laptop despite it being a worse value than, say, a GTX 560 Ti on the desktop, correct? NV is simply meeting market demand with its pricing. If no one bought their cards at their prices, they would have been forced to lower them.
  • RussianSensation - Monday, November 07, 2011 - link

    I meant if you wanted the "best value", you would have been gaming on a *desktop*.
  • Sunsmasher - Monday, November 07, 2011 - link

    Less swearing please. It's not necessary in order to make your point here.
  • Dustin Sklavos - Tuesday, November 08, 2011 - link

    I decided to go dig up some facts.

    First, the GTX 580M is basically a downclocked desktop GeForce GTX 560 Ti. Okay.

    The 560 Ti has a TDP of 170W, with a core clock of 822MHz and GDDR5 clock of 4GHz. The GTX 580M, by comparison, has a TDP of 100W, with a core clock of 620MHz and GDDR5 clock of 3GHz. That means that to get the GF114 to run at ~60% of the TDP of its desktop counterpart, it's running at 75% of the core and memory clocks, AND that's before you take into account that it also has twice the GDDR5. While I wouldn't expect linear scaling, it's reasonable to think there's some heavy duty binning going on here. You're also minimizing the difference in cost; the 580M's upgrade price dropped by at LEAST $300.

    Okay, so what about the GTX 560M that you're kvetching about? Well, that's basically a desktop GeForce GTX 550 Ti. The 550 Ti has a TDP of 116W (and I'm keen to point out that the 100W GTX 580M thoroughly outclasses it), with a core clock of 900MHz and GDDR5 clock of 4.1GHz. Meanwhile, its mobile counterpart, the GTX 560M, operates with a core clock of 775MHz and GDDR5 clock of 2.5GHz. There's no published TDP spec for the 560M, but it's estimated to be about 75W. So for about ~65% of the desktop part's TDP, you get 86% of the core clock and 60% of the GDDR5 clock. Once again, it's reasonable to think these are also being binned.

    The prices on notebook GPUs are always going to be silly compared to their desktop cousins. But the desktop chips don't have to fit into tight thermal envelopes the way their mobile counterparts do. Look at the math: it's a lot easier to make a desktop 550 Ti than it is to make a GTX 560M.
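
    If you want to run the numbers yourself, here's a quick back-of-the-envelope sketch in Python using the spec figures quoted above (keeping in mind the 560M's 75W TDP is an estimate, not a published number):

    # Rough mobile-vs-desktop comparison using the spec numbers quoted above.
    # NOTE: the GTX 560M's 75 W TDP is an estimate, not a published figure.
    specs = {
        # name: (TDP in W, core clock in MHz, effective GDDR5 clock in MHz)
        "GTX 560 Ti (desktop)": (170, 822, 4000),
        "GTX 580M (mobile)":    (100, 620, 3000),
        "GTX 550 Ti (desktop)": (116, 900, 4100),
        "GTX 560M (mobile)":    (75,  775, 2500),
    }

    def ratios(mobile, desktop):
        # Mobile/desktop ratios for TDP, core clock, and memory clock.
        return [m / d for m, d in zip(specs[mobile], specs[desktop])]

    for mobile, desktop in [("GTX 580M (mobile)", "GTX 560 Ti (desktop)"),
                            ("GTX 560M (mobile)", "GTX 550 Ti (desktop)")]:
        tdp, core, mem = ratios(mobile, desktop)
        print(f"{mobile} vs. {desktop}: {tdp:.0%} of the TDP "
              f"at {core:.0%} core and {mem:.0%} memory clocks")

    # Prints roughly: 59% of the TDP at 75% core / 75% memory for the 580M,
    # and 65% of the TDP at 86% core / 61% memory for the 560M.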
  • geniekid - Tuesday, November 08, 2011 - link

    Thanks for that :)
  • PellyNVIDIA - Tuesday, November 08, 2011 - link

    Exactly! Thank you for posting that Dustin!

    A quick look at the desktop GeForce GTX 580 shows a card that is ~8" long, has a dual-slot heatsink assembly, requires a 700W+ PSU, has two power connectors in addition to getting power via PCIE, etc.

    In stark contrast, the notebook GeForce GTX 580M module is roughly a quarter the size of the desktop PCB, features a low-profile heatsink, fits in a 100W power envelope, requires no additional power connectors, etc.

    As Dustin points out above, the characteristics necessary to hit the right clocks for the notebook parts are largely different from those used to create the desktop parts. You only get a certain number of these particular chips per wafer, so it all comes down to supply and demand.
  • aguilpa1 - Tuesday, November 08, 2011 - link

    So why call it a GTX 580M at all, if we all agree it is nowhere near the specifications of a true GTX 580? Call it what it is: a 560M or a 550M. I would even be OK with 565M or 555M, but making it out to be a product that it obviously is not is the major issue here. It has nothing to do with binning or the laws of thermodynamics, whatever. I wouldn't give either Nvidia or AMD a free ride on this. Their naming of products is misleading and deceitful, all in the name of charging more.

    Would you pay a lot more for a 200 hp motor (branded as a V8 Hemi) if it was put in a small compact car and only had 6 cylinders? What if you knew it was neither a V8 nor a Hemi, and the real one produced 350 hp? (This is just an example, as I am not sure of the actual horsepower of a true V8 Hemi.) Would you then argue that, because you had to shove such a big and powerful motor into a small chassis, you were justified in charging several times more?
  • PellyNVIDIA - Tuesday, November 08, 2011 - link

    NVIDIA uses the naming convention to convey performance within a family of GPUs.

    "GTX" signifies the enthusiast segment and the numerical value indicates overall performance. (ie: 580 is faster than 570, etc.)

    For the 5xx family (both desktop and notebook), the GTX 580 brand designates the fastest GPU. Without question, the GeForce GTX 580M is the fastest notebook GPU we (or any other vendor) offer. With that in mind, the GPU deserves the GTX 580M designation, to indicate that it offers the best possible performance available in a notebook.
  • nerrawg - Wednesday, November 09, 2011 - link

    Actually, both you (NVIDIA) and AMD deserve some flak for your naming schemes. While we enthusiasts always know the differences in performance (not to mention stream processor counts, shader clocks, etc.), the general customer does not, and there has been much "tomfoolery" going on in this regard. To mention a few examples:

    NVIDIA:

    The desktop GTS 250 was actually considerably faster than the GTS 450 (we know why, of course), but following your logic above, that card should have been two generations ahead (we know it was only one) and therefore expected to be significantly faster (just compare the GTX 280 to the current GTX 580).

    AMD:
    The 5850 was faster than the 6850, and the same goes for the 5870 and 6870 - confusing to those not in the "know".
    Several models have the same performance: the 5750 vs. the 6750, and the 5770 vs. the 6770.
    NVIDIA does this as well, but with their mobile GPUs.

    Shame on you both for misleading all those sheep out there! You're the reason people are always asking us (enthusiasts) what to buy! (It gets annoying after a while.)
  • therealnickdanger - Wednesday, November 09, 2011 - link

    A fool and his money...

    Yeah, I find the whole process deceptive. If the underlying hardware is really a 560, then it IS a 560, not a 580. ATI and NVIDIA are both irresponsible in this regard.

    The excuse that "it's a marketing decision" is essentially an excuse to lie to the customer, not to convey an accurate depiction of performance. The people who want "top of the line" performance will buy the most expensive GPU offered whether it says 580 or 560, so I believe it would be responsible to name it realistically.

    But hey, y'know, whatever.
