Meet the GTX 570

As we quickly touched upon in our introduction, the GTX 570 is a complete reuse of the GTX 580 design. NVIDIA used the same PCB, cooler, power throttling chips, and shroud as the GTX 580; our reference card is even clad in nearly identical livery to the GTX 580's. Indeed, the only external hardware difference between the two cards is that the GTX 580 uses 6+8pin PCIe power sockets, while the GTX 570 uses 6+6pin PCIe power sockets.


Top: GTX 570. Bottom: GTX 580

Using the same design as the GTX 580 has no doubt let NVIDIA bring the GTX 570 to market quickly and cheaply, but it also means that the GTX 570 inherits the same design improvements we saw on the GTX 580. Cooling for the GTX 570 is thus provided by a vapor chamber-based aluminum heatsink, backed by NVIDIA’s reinforced blower. For SLI users this also means it’s using NVIDIA’s angled shroud, which is supposed to allow for better cooling in tight spaces. As we’ve already seen on the GTX 580, this design can give the GTX 470 a run for its money, so it shouldn’t be a surprise when we say the GTX 570 is similarly capable. Overall, the only notable downside to this design is that because NVIDIA is reusing the GTX 580 design, the GTX 570 also inherits the GTX 580’s 10.5” length, making it an inch longer than the GTX 470.

As with the GTX 580, the situation with custom GTX 570s will be nebulous. NVIDIA is taking tighter control of the GTX 500 series and will only be approving designs that are equal to or superior to the reference design. This isn’t a bad thing, but it means there’s less latitude for custom designs, particularly if someone wants to try lopping an inch off of the card to make it the same length as the GTX 470. Interestingly, triple-slot coolers are also out – we found out last week that NVIDIA is vetoing them on the GTX 580 (and no doubt the GTX 570) as they aren’t suitable for use in SLI mode on most motherboards, so any custom designs that do appear will definitely be more conservative than what we’ve seen with the GTX 400 series.

Since NVIDIA is reusing the GTX 580 PCB, I/O is identical to the GTX 580’s. Here it’s covered by the usual NVIDIA configuration of 2x DVI ports and 1x mini-HDMI port, with the second slot occupied by the card’s exhaust. This also means the card can only drive 2 of the 3 ports at once, so you’ll need an SLI configuration to take advantage of NVIDIA/3DVision Surround. Meanwhile HDMI 1.4a for 3D video purposes is supported by the card’s mini-HDMI port, but audio bitstreaming is not, limiting audio output to LPCM and DD+/DTS.


54 Comments


  • xxtypersxx - Tuesday, December 07, 2010 - link

    If this thing can hit 900mhz it changes the price/performance picture entirely, why no overclock coverage in such a comprehensive review?

    Otherwise great write up as always!
  • Bhairava - Tuesday, December 07, 2010 - link

Yes, good point.
  • vol7ron - Tuesday, December 07, 2010 - link

    Why do graphics cards cost more than cpu+mobo these days?

    I know there's a different design process and maybe there isn't as much an economy of scale, but I'm just thinking about the days when it was reverse.
  • Klinky1984 - Tuesday, December 07, 2010 - link

Well, you're essentially buying a computer on a card these days: a high performance GPU with high performance, pricey RAM, all of which needs high quality power components to run. GPUs are now computers inside of computers.
  • lowlymarine - Tuesday, December 07, 2010 - link

I think it's simply that GPUs can't get cheaper to the extent that CPUs have, since the die sizes are so much larger. I certainly wouldn't say they're getting MORE expensive - I paid $370 for my 8800GTS back in early 2007, and $400 for a 6800 in early 2005 before that.
  • DanNeely - Tuesday, December 07, 2010 - link

High end GPU chips are much larger than high end CPU chips nowadays. The GF110 has 3bn transistors. For comparison, a quad-core i7 only has 700m, and a 6-core Athlon 900m, so you get 3 or 4 times as many CPUs from a wafer as you can GPUs. The quad-core Itanic and octo-core i7 are both around 2bn transistors but cost more than most gaming rigs for just the chip.

    GDDR3/5 are also significantly more expensive than the much slower DDR3 used by the rest of the computer.
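The wafer math in the comment above can be roughed out numerically. A minimal sketch, using the transistor counts quoted in the comment and assuming (simplistically) that both chips have similar transistor density, so die area — and therefore dies per wafer — scales with transistor count, ignoring wafer-edge losses and defect yield:

```python
# Rough dies-per-wafer comparison from transistor counts alone.
# Assumption: comparable transistor density on both chips, so die area
# (and hence candidate dies per wafer) scales with transistor count.
GF110_TRANSISTORS = 3.0e9       # GF110 GPU (GTX 570/580), as quoted above
QUAD_I7_TRANSISTORS = 0.7e9     # quad-core i7, figure quoted above

ratio = GF110_TRANSISTORS / QUAD_I7_TRANSISTORS
print(f"Roughly {ratio:.1f}x as many CPU dies per wafer as GPU dies")
```

In practice the CPU's advantage is even larger than this naive ratio suggests: the probability that a die is defect-free falls off sharply with die area, and larger rectangular dies waste more of a round wafer's edge, so the "3 or 4 times" figure is if anything conservative on cost per good die.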
  • ET - Tuesday, December 07, 2010 - link

They don't. A Core i7-975 costs way more than any graphics card. A GIGABYTE GA-X58A-UD9 motherboard costs $600 at Newegg.
  • ET - Tuesday, December 07, 2010 - link

    Sorry, was short on time. I'll add that you forgot to consider the price of the very fast memory on high end graphics cards.

    I do agree, though, that a combination of mid-range CPU and board and high end graphics card is cost effective.
  • mpschan - Wednesday, December 08, 2010 - link

    Don't forget that in a graphics card you're getting a larger chip with more processing power, a board for it to run on, AND memory. 1GB+ of ultra fast memory and the tech to get it to work with the GPU is not cheap.

So your question needs to factor in cpu+mobo+memory, and even then that combination doesn't have the capability to process graphics at the needed rate.

    Generic processing that is slower at certain tasks will always be cheaper than specialized, faster processing that excels at said task.
  • slagar - Wednesday, December 08, 2010 - link

    High end graphics cards were always very expensive. They're for enthusiasts, not the majority of the market.
I think prices have come down for the majority of consumers. Mostly thanks to AMD's moves, budget cards are now highly competitive, and offer acceptable performance in most games at acceptable quality. I think high end cards just aren't as necessary as they were 'back in the day', but then, maybe I just don't play games as much as I used to. To me, it was always the case that you'd pay an arm and a leg for an upper tier card, and that hasn't changed.
