Final Thoughts

If we took the conclusion from our GeForce GTX 580 article and replaced 580, 480, and 6870CF with 570, 470, and 6850CF respectively, our final thoughts would be almost identical. But then the GTX 580 and GTX 570 are almost identical too.

Whereas the GTX 580 took a two-pronged approach, raising the bar on GPU performance while simultaneously reducing power consumption, the GeForce GTX 570 takes a much more single-track approach. It is for all intents and purposes the new GTX 480, offering gaming performance virtually identical to the GTX 480 at a lower price, with less power consumption, lower temperatures, and less noise. As a lower-tier GF110 card the GTX 570 won’t wow the world with its performance, but like the GTX 580 it’s a solid step forward. In this case it’s a solid step towards bringing yesterday’s performance to market at a lower price and with power/thermal/noise characteristics better suited to more systems. If nothing else, NVIDIA has translated the GTX 580’s excellent balance of performance and noise to a lower-priced, lower-performing tier.

Furthermore, at $350 NVIDIA is the only game in town for single-GPU cards for the time being. Until an AMD competitor comes along, NVIDIA has done a good job of filling the gap between the GTX 580 and GTX 470, an action very reminiscent of how the GTX 470 filled the gap between the Radeon HD 5870 and 5850 earlier this year. With no single-card alternative on the market right now, the only competition is the GeForce GTX 460 1GB in SLI and the Radeon HD 6850 in CrossFire. The Radeon in particular should not be underestimated – it can trounce the GTX 570 almost at will – but it’s dogged by the fact that 6850 prices are running high right now, putting it at a $30+ price premium over the GTX 570. And of course both multi-GPU solutions face the usual caveats: uneven performance scaling, more noise, and a reliance on driver updates to unlock the 2nd GPU in new games. As with the GTX 580 we’d pick the simplicity of a single-GPU setup over the potential performance advantages of a multi-GPU setup, but as always this is a personal decision.

As a gap-filler the GTX 570 is largely what we expected the moment we saw the GTX 580, and we have no serious qualms with it. The one thing that does disappoint us is the pricing: $350 is conservative rather than aggressive. The GTX 570 is fast enough to justify its position and the usual high-end price premium, but at $100 over the GTX 470 and Radeon HD 5870 – roughly 40% more money – you’re paying a lot for that additional 20-25% in performance. Certainly we’re going to be happy campers if AMD’s next series of cards can put some pressure on NVIDIA here.
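To put that value argument in concrete terms, here is a rough back-of-the-envelope sketch (not from the original review) that assumes a ~$250 street price for the GTX 470/HD 5870 (i.e. $350 minus the $100 premium) and uses the 20-25% performance delta cited above; the figures are illustrative only.

# Rough price/performance comparison for the GTX 570 vs. the ~$250 cards it sits above.
# The assumed prices and the 20-25% performance delta come from the discussion above;
# treat them as illustrative placeholders, not measured data.

GTX_570_PRICE = 350.0      # launch MSRP
OLDER_CARD_PRICE = 250.0   # assumed GTX 470 / HD 5870 street price ($350 - $100)

price_premium = GTX_570_PRICE / OLDER_CARD_PRICE - 1.0   # ~0.40, i.e. 40% more money

for perf_gain in (0.20, 0.25):   # 20-25% faster, per the review's numbers
    # Relative cost of each unit of performance vs. the older card (1.0 = same value)
    cost_per_perf = (1.0 + price_premium) / (1.0 + perf_gain)
    print(f"{perf_gain:.0%} faster: {price_premium:.0%} price premium, "
          f"{cost_per_perf:.2f}x the cost per unit of performance")

Run as-is, this prints roughly 1.17x and 1.12x the cost per unit of performance – which is the sense in which $350 looks conservative rather than aggressive.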

And finally, that brings us to AMD. AMD’s schedule calls for the Radeon HD 6900 series to be launched by the end of the year, and the year is quickly running out. There’s still too much uncertainty to advise holding off on any GTX 500 series purchases (particularly if you expect to have a card for Christmas), but if you’re not in a rush for a card it could be worth waiting a couple more weeks to see what AMD has up their sleeves. A holiday slugfest between AMD and NVIDIA and the resulting price drops are certainly at the top of our wish lists.

Comments
  • TheHolyLancer - Tuesday, December 7, 2010 - link

Likely because when the 6870s came out they included an FTW edition of the 460 and got hammered for it? Not to mention that in their own guidelines they said no OCing in launch articles.

If they do do an OC comparison, it'll most likely be in a special article, possibly with retail-bought samples rather than the samples sent for review...
  • Ryan Smith - Tuesday, December 7, 2010 - link

    As a rule of thumb I don't do overclock testing with a single card, as overclocking is too variable. I always wait until I have at least 2 cards to provide some validation to our results.
  • CurseTheSky - Tuesday, December 7, 2010 - link

    I don't understand why so many cards still cling to DVI. Seeing that Nvidia is at least including native HDMI on their recent generations of cards is nice, but why, in 2010, on an enthusiast-level graphics card, are they not pushing the envelope with newer standards?

    The fact that AMD includes DVI, HDMI, and DisplayPort natively on their newer lines of cards is probably what's going to sway my purchasing decision this holiday season. Something about having all of these small, elegant, plug-in connectors and then one massive screw-in connector just irks me.
  • Vepsa - Tuesday, December 7, 2010 - link

It's because most people still have DVI on their desktop monitors.
  • ninjaquick - Tuesday, December 7, 2010 - link

DVI is a very good plug, man; I don't see why you're hating on it.
  • ninjaquick - Tuesday, December 7, 2010 - link

    I meant to reply to OP.
  • DanNeely - Tuesday, December 7, 2010 - link

Aside from Apple, almost no one uses DP. Assuming it wasn't too late in the life cycle to do so, I suspect the new GPU used in next year's 6xx series of cards will have DP support so NVIDIA can offer many-display gaming on a single card, but only because a single DP clockgen (shared by all DP displays) is cheaper to add than four more legacy clockgens (one needed per VGA/DVI/HDMI display).
  • Taft12 - Tuesday, December 7, 2010 - link

    Market penetration is just a bit more important than your "elegant connector" for an input nobody's monitor has. What a poorly thought-out comment.
  • CurseTheSky - Tuesday, December 7, 2010 - link

    Market penetration starts by companies supporting the "cutting edge" of technology. DisplayPort has a number of advantages over DVI, most of which would be beneficial to Nvidia in the long run, especially considering the fact that they're pushing the multi-monitor / combined resolution envelope just like AMD.

    Perhaps if you only hold on to a graphics card for 12-18 months, or keep a monitor for many years before finally retiring it, the connectors your new $300 piece of technology provides won't matter to you. If you're like me and tend to keep a card for 2+ years while jumping on great monitor deals every few years as they come up, it's a different ballgame. I've had DisplayPort-capable monitors for about 2 years now.
  • Dracusis - Tuesday, December 7, 2010 - link

I invested just under $1000 in a 30" professional 8-bit PVA LCD back in 2006 that is still better than 98% of the crappy 6-bit TN panels on the market. It has been used with 4 different video cards and supports DVI, VGA, Component HD, and Composite SD. It has an ultra-wide color gamut (113%), great contrast, a matte screen with super deep blacks, and perfectly uniform backlighting, along with memory card readers and USB ports.

Neither DisplayPort nor any other monitor on the market offers me anything new or better in terms of visual quality or features.

If you honestly see an improvement in quality from spending $300 every 18 months on new "value" displays, then I feel sorry for you; you've made some poorly informed choices and wasted a lot of money.
