Meet the GTX 570

As we quickly touched upon in our introduction, the GTX 570 is a complete reuse of the GTX 580 design. NVIDIA used the same PCB, cooler, power throttling chips, and shroud as the GTX 580; our reference card is even clad in nearly identical livery to the GTX 580. Indeed the only hardware difference between the two cards from the outside is that the GTX 580 uses 6+8pin PCIe power sockets, while the GTX 570 uses 6+6pin PCIe power sockets.


Top: GTX 570. Bottom: GTX 580

Reusing the GTX 580 design has no doubt let NVIDIA bring the GTX 570 to market quickly and cheaply, and it also means the GTX 570 inherits the same design improvements we saw on the GTX 580. Cooling for the GTX 570 is provided by a vapor chamber-based aluminum heatsink, backed by NVIDIA’s reinforced blower. For SLI users this also means it uses NVIDIA’s angled shroud, which is supposed to allow for better cooling in tight spaces. As we’ve already seen on the GTX 580, this design can give the GTX 470 a run for its money, so it shouldn’t be a surprise when we say that the GTX 570 is similarly capable. Overall the only notable downside is that because NVIDIA is using the GTX 580 design, the GTX 570 also inherits the GTX 580’s 10.5” length, making it an inch longer than the GTX 470.

As with the GTX 580, the situation with custom GTX 570s will be nebulous. NVIDIA is taking tighter control of the GTX 500 series and will only be approving designs that are equal or superior to the reference design. This isn’t a bad thing, but it means there’s less latitude for custom designs, particularly if someone wants to try lopping an inch off of the card to make it the same length as the GTX 470. Interestingly, triple-slot coolers are also out – we found out last week that NVIDIA is vetoing them on the GTX 580 (and no doubt the GTX 570) as they aren’t suitable for use in SLI with most motherboards, so any custom designs that do appear will definitely be more conservative than what we’ve seen with the GTX 400 series.

Since NVIDIA is reusing the GTX 580 PCB, I/O is identical to the GTX 580. Here it’s handled by the usual NVIDIA configuration of 2x DVI ports and 1x mini-HDMI port, with the second slot occupied by the card’s exhaust. This also means the card can only drive 2 of the 3 ports at once, so you’ll need an SLI configuration to take advantage of NVIDIA Surround/3D Vision Surround. Meanwhile HDMI 1.4a for 3D video purposes is supported through the card’s mini-HDMI port, but bitstreaming of lossless audio formats is not supported, limiting audio output to LPCM and lossy DD+/DTS.

Comments

  • TheHolyLancer - Tuesday, December 7, 2010 - link

Likely because when the 6870s came out they included an FTW edition of the 460 and were hammered for it? Not to mention in their own guidelines they said no OCing in launch articles.

If they do an OC comparison, it will most likely be in a special article, possibly with retail-bought samples rather than sent demos...
  • Ryan Smith - Tuesday, December 7, 2010 - link

    As a rule of thumb I don't do overclock testing with a single card, as overclocking is too variable. I always wait until I have at least 2 cards to provide some validation to our results.
  • CurseTheSky - Tuesday, December 7, 2010 - link

    I don't understand why so many cards still cling to DVI. Seeing that Nvidia is at least including native HDMI on their recent generations of cards is nice, but why, in 2010, on an enthusiast-level graphics card, are they not pushing the envelope with newer standards?

    The fact that AMD includes DVI, HDMI, and DisplayPort natively on their newer lines of cards is probably what's going to sway my purchasing decision this holiday season. Something about having all of these small, elegant, plug-in connectors and then one massive screw-in connector just irks me.
  • Vepsa - Tuesday, December 7, 2010 - link

It's because most people still have DVI on their desktop monitors.
  • ninjaquick - Tuesday, December 7, 2010 - link

    DVI is a very good plug man, I don't see why you're hating on it.
  • ninjaquick - Tuesday, December 7, 2010 - link

    I meant to reply to OP.
  • DanNeely - Tuesday, December 7, 2010 - link

Aside from Apple, almost no one uses DP. Assuming it wasn't too late in the life cycle to do so, I suspect that the new GPU used in the 6xx series of cards next year will have DP support so NVIDIA can offer multi-display gaming on a single card, but only because a single DP clockgen (shared by all DP displays) is cheaper to add than 4 more legacy clockgens (one needed per VGA/DVI/HDMI display).
  • Taft12 - Tuesday, December 7, 2010 - link

    Market penetration is just a bit more important than your "elegant connector" for an input nobody's monitor has. What a poorly thought-out comment.
  • CurseTheSky - Tuesday, December 7, 2010 - link

    Market penetration starts by companies supporting the "cutting edge" of technology. DisplayPort has a number of advantages over DVI, most of which would be beneficial to Nvidia in the long run, especially considering the fact that they're pushing the multi-monitor / combined resolution envelope just like AMD.

    Perhaps if you only hold on to a graphics card for 12-18 months, or keep a monitor for many years before finally retiring it, the connectors your new $300 piece of technology provides won't matter to you. If you're like me and tend to keep a card for 2+ years while jumping on great monitor deals every few years as they come up, it's a different ballgame. I've had DisplayPort-capable monitors for about 2 years now.
  • Dracusis - Tuesday, December 7, 2010 - link

I invested just under $1000 in a 30" professional 8-bit PVA LCD back in 2006 that is still better than 98% of the crappy 6-bit TN panels on the market. It has been used with 4 different video cards, supports DVI, VGA, Component HD and Composite SD. It has an ultra wide color gamut (113%), great contrast, a matte screen with super deep blacks and perfectly uniform backlighting, along with memory card readers and USB ports.

DisplayPort, like any other monitor feature on the market, offers me absolutely nothing new or better in terms of visual quality or features.

If you honestly see an improvement in quality spending $300 every 18 months on new "value" displays then I feel sorry for you; you've made some poorly informed choices and wasted a lot of money.
