Meet the GTX 580

Since we’ve already discussed the cooling, let’s dive right into the rest of the GTX 580, shall we?

Launching today will be a single GTX 580 design: the reference design. According to our contacts, semi-custom designs (designs using the reference PCB with a different cooler) are due in the next few weeks, assuming everything goes to plan and of course there’s ample supply. And while we’re on that note, NVIDIA let us know that with their focus on cooling for the GTX 580 they aren’t going to be letting custom GTX 580 designs go out without a more thorough inspection. The acoustic performance of the reference GTX 580 is going to be the bare minimum to get a design approved: if it can’t beat the reference design, NVIDIA won’t allow it. We consider this a matter of brand protection for the company, as a bad/loud GeForce is still a GeForce all the same.

Top: GTX 480. Bottom: GTX 580

With the reference design the resulting card is very close to being a GTX 285/480 hybrid, and in terms of overall design it ends up looking very similar to the GTX 285. At 10.5” long it’s the same length as the GTX 480 and a smidge longer than the GTX 285, and should fit into any case those cards could work in. Power connectivity is the same as the GTX 480, with 6-pin and 8-pin PCIe sockets located at the top of the card, providing easy access to the sockets. At 244W TDP the card draws too much for a 6+6 configuration, but you can count on an eventual GTX 570 to fill that niche. Meanwhile NVIDIA has kept the 480’s detachable shroud lid, meaning you can remove the cover of the shroud without disturbing the rest of the card; note that it’s secured with screws rather than latches this time, however.
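As a quick illustration of why 6+6 doesn’t cut it, here’s a back-of-the-envelope sketch using the PCI Express spec limits of 75W from the slot, 75W per 6-pin connector, and 150W per 8-pin connector; the headroom figures are spec math against the 244W TDP, not measured power draw.

```python
# Back-of-the-envelope PCIe power budget check (spec limits, not measurements)
SLOT_W = 75        # PCIe x16 slot
SIX_PIN_W = 75     # 6-pin PCIe power connector
EIGHT_PIN_W = 150  # 8-pin PCIe power connector

TDP_W = 244        # GTX 580 TDP

def board_budget(*connectors):
    """Total board power available from the slot plus the given connectors."""
    return SLOT_W + sum(connectors)

for name, conns in [("6+6", (SIX_PIN_W, SIX_PIN_W)),
                    ("6+8", (SIX_PIN_W, EIGHT_PIN_W))]:
    total = board_budget(*conns)
    print(f"{name}: {total}W available, {total - TDP_W:+d}W of headroom vs. {TDP_W}W TDP")

# 6+6: 225W available, -19W of headroom vs. 244W TDP  -> not enough
# 6+8: 300W available, +56W of headroom vs. 244W TDP  -> fits comfortably
```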


Heatsinks Exposed! Top: GTX 480. Bottom: GTX 580

On the front side of the PCB you’ll find the 12 GDDR5 chips that compose the card’s 384-bit memory bus. The thermal pads connecting the memory to the shroud have once again wiped out the chip markings, so we haven’t been able to determine what these chips are, although we’re confident they’re 5Gbps parts like those on past cards. At the center of the card is the GF110 GPU encased in a metal heatspreader, a common sight for NVIDIA’s high-end GPUs. This is an A1 revision GPU, which in NVIDIA’s counting system means it’s from the first tape-out. Elsewhere on the board you’ll find the 2 SLI connectors, providing support for tri-SLI on the 580. All told, while the GPU has been refined the PCB remains largely unchanged from the GTX 480 other than the removal of the ventilation holes: all of the ICs are in practically the same place, and even the VRM controller is the same.
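For reference, those figures translate into peak memory bandwidth with simple arithmetic: bus width (in bytes) times per-pin data rate. The sketch below runs the numbers for both the chips’ 5Gbps rating and a 1002MHz (4.008Gbps effective) operating clock; treat the latter as our assumption about the shipping clock rather than something taken from this page.

```python
# Peak GDDR5 bandwidth (GB/s) = bus width in bytes * per-pin data rate in Gbps
def peak_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

BUS_WIDTH_BITS = 384  # 12 GDDR5 chips x 32 bits each

print(peak_bandwidth_gbps(BUS_WIDTH_BITS, 5.0))    # 240.0 GB/s if run at the chips' 5Gbps rating
print(peak_bandwidth_gbps(BUS_WIDTH_BITS, 4.008))  # ~192.4 GB/s at an assumed 1002MHz (4.008Gbps) memory clock
```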

Meanwhile, looking at the I/O bracket for the 580, we find the same configuration we saw on the 480: below a full-sized vent are 2 DVI ports and a mini-HDMI port. NVIDIA slightly revised their display controller for GF110/GTX 580; the good news is that HDMI 1.4a is supported, while the bad news is that full audio bitstreaming is not, so audio support is the same as it was on the GTX 480: 8-channel LPCM and lossy formats like DD+ and DTS. This actually caught us off-guard, since we were expecting the full GF104 treatment here, but it just goes to show that this is a GF100 derivative after all. Unfortunately this also extends to the number of displays supported; NVIDIA still only supports 2 displays per card, so you need to run in SLI if you intend to take advantage of 3D Vision/NVIDIA Surround across 3 monitors.

Finally, it’s with some sense of irony that we find ourselves yelling more at AMD than NVIDIA for naming shenanigans this time around, considering it was NVIDIA that brought us the 8800GT/9800GT and GeForce 200/300 product naming snafus. While NVIDIA has made some changes compared to the GTX 480, it’s a very short list, shorter even than AMD’s list for the 6800 series. At the same time, at least the GTX 580 is faster than the GTX 480, which is more than can be said for AMD’s 6800 series versus the 5800 series. Quite frankly the GTX 580 should be the GTX 485: the few architectural changes we’ve seen do make a difference, but then NVIDIA did a whole die shrink on the GTX 280 and only got a GTX 285 out of it. Both companies seem committed to launching a new family of video cards this year regardless of whether the underlying GPU has actually changed. Ultimately the GTX 580 is the second flimsiest excuse for a new series number, next only to simply rebranding an existing GPU.

Comments

  • Taft12 - Tuesday, November 9, 2010 - link

    In this article, Ryan does exactly what you are accusing him of not doing! It is you who need to be asked WTF is wrong
  • Iketh - Thursday, November 11, 2010 - link

    ok EVERYONE belonging to this thread is on CRACK... what other option did AMD have to name the 68xx? If they named them 67xx, the differences between them and 57xx are too great. They use nearly as little power as 57xx yet the performance is 1.5x or higher!!!

    im a sucker for EFFICIENCY... show me significant gains in efficiency and i'll bite, and this is what 68xx handily brings over 58xx

    the same argument goes for 480-580... AT, show us power/performance ratios between generations on each side, then everyone may begin to understand the naming

    i'm sorry to break it to everyone, but this is where the GPU race is now, in efficiency, where it's been for cpus for years
  • MrCommunistGen - Tuesday, November 9, 2010 - link

    Just started reading the article and I noticed a couple of typos on p1.

    "But before we get to deep in to GF110" --> "but before we get TOO deep..."

    Also, the quote at the top of the page was placed inside of a paragraph which was confusing.
    I read: "Furthermore GTX 480 and GF100 were clearly not the" and I thought: "the what?". So I continued and read the quote, then realized that the paragraph continued below.
  • MrCommunistGen - Tuesday, November 9, 2010 - link

    well I see that the paragraph break has already been fixed...
  • ahar - Tuesday, November 9, 2010 - link

    Also, on page 2 if Ryan is talking about the lifecycle of one process then "...the processes’ lifecycle." is wrong.
  • Aikouka - Tuesday, November 9, 2010 - link

    I noticed the remark on Bitstreaming and it seems like a logical choice *not* to include it with the 580. The biggest factor is that I don't think the large majority of people actually need/want it. While the 580 is certainly quieter than the 480, it's still relatively loud and extraneous noise is not something you want in a HTPC. It's also overkill for a HTPC, which would delegate the feature to people wanting to watch high-definition content on their PC through a receiver, which probably doesn't happen much.

    I'd assume the feature could've been "on the board" to add, but would've probably been at the bottom of the list and easily one of the first features to drop to either meet die size (and subsequently, TDP/Heat) targets or simply to hit their deadline. I certainly don't work for nVidia so it's really just pure speculation.
  • therealnickdanger - Tuesday, November 9, 2010 - link

    I see your points as valid, but let me counterpoint with 3-D. I think NVIDIA dropped the ball here in the sense that there are two big reasons to have a computer connected to your home theater: games and Blu-ray. I know a few people that have 3-D HDTVs in their homes, but I don't know anyone with a 3-D HDTV and a 3-D monitor.

    I realize how niche this might be, but if the 580 supported bitstreaming, then it would be the perfect card for anyone that wants to do it ALL. Blu-ray, 3-D Blu-Ray, any game at 1080p with all eye-candy, any 3-D game at 1080p with all eye-candy. But without bitstreaming, Blu-ray is moot (and mute, IMO).

    For a $500+ card, it's just a shame, that's all. All of AMD's high-end cards can do it.
  • QuagmireLXIX - Sunday, November 14, 2010 - link

    Well said. There are quite a few fixes that make the 580 what I wanted in March, but the lack of bitstream is still a hard hit for what I want my PC to do.

    Call me niche.
  • QuagmireLXIX - Sunday, November 14, 2010 - link

    Actually, this is killing me. I waited for the 480 in March b4 pulling the trigger on a 5870 because I wanted HDMI to a Denon 3808 and the 480 totally dropped the ball on the sound aspect (S/PDIF connector and limited channels and all). I figured no big deal, it is a gamer card after all, so 5870 HDMI I went.

    The thing is, my PC is all-in-one (HTPC, Game & typical use). The noise and temps are not a factor as I watercool. When I read that HDMI audio got internal on the 580, I thought, finally. Then I read Guru's article and seen bitstream was hardware supported and just a driver update away, I figured I was now back with the green team since 8800GT.

    Now Ryan (thanks for the truth, I guess :) counters Gurus bitstream comment and backs it up with direct communication with NV. This blows, I had a lofty multimonitor config in mind and no bitstream support is a huge hit. I'm not even sure if I should spend the time to find out if I can arrange the monitor setup I was thinking.

    Now I might just do a HTPC rig and Game rig or see what 6970 has coming. Eyefinity has an advantage for multiple monitors, but the display-port puts a kink in my designs also.
  • Mr Perfect - Tuesday, November 9, 2010 - link

    So where do they go from here? Disable one SM again and call it a GTX570? GF104 is too new to replace, so I suppose they'll enable the last SM on it for a GTX560.
