The Test

For our look at the GTX 580 we will be covering single-card performance only. As a measure of promotion for their OEM partners, NVIDIA would only make a second GTX 580 available to us if we also agreed to review a high-end gaming system. Because the high-end system was completely unnecessary for a GPU review we declined NVIDIA’s offer, and as a result we received only the one GTX 580 you’ll be seeing here today. We will look at SLI performance once we can acquire a second GTX 580 further down the line.

For our testing we’ll be using the latest version of our GPU benchmark suite, which was introduced back in our Radeon HD 6800 series review two weeks ago. We’re using the latest drivers from both AMD and NVIDIA here – Catalyst Hotfix 10.10d for AMD, and ForceWare 262.99 for the NVIDIA cards.

Finally, as we mentioned earlier, AMD doesn’t have a direct competitor to the GTX 580. The closest competitors they have are dual-GPU setups in the form of the closeout Radeon HD 5970 and the 6870 in CrossFire. Meanwhile NVIDIA has cut GTX 470 prices so far to the bone that you can pick up a pair of them for the price of a single GTX 580. Two slightly crippled GF100 cards versus one GF110 card will not be a fair fight…

CPU: Intel Core i7-920 @ 3.33GHz
Motherboard: Asus Rampage II Extreme
Chipset Drivers: Intel 9.1.1.1015 (Intel)
Hard Disk: OCZ Summit (120GB)
Memory: Patriot Viper DDR3-1333 3 x 2GB (7-7-7-20)
Video Cards: AMD Radeon HD 6870
AMD Radeon HD 6850
AMD Radeon HD 5970
AMD Radeon HD 5870
AMD Radeon HD 5850
AMD Radeon HD 5770
AMD Radeon HD 4870
NVIDIA GeForce GTX 580
NVIDIA GeForce GTX 480
NVIDIA GeForce GTX 470
NVIDIA GeForce GTX 460 1GB
NVIDIA GeForce GTX 460 768MB
NVIDIA GeForce GTX 285
NVIDIA GeForce GTX 260 Core 216
Video Drivers: NVIDIA ForceWare 262.99
AMD Catalyst 10.10d
OS: Windows 7 Ultimate 64-bit
Comments

  • Sihastru - Tuesday, November 9, 2010 - link

    At this point it makes no sense to get rattled up about the 580. We must patiently wait for the 69x0 cards and see what they can bring to the table. I heard rumours of AMD delaying their cards to the end of the year in order to do some "tweaks"...
  • nitrousoxide - Tuesday, November 9, 2010 - link

    Delaying is a good thing, because it indicates that Cayman can be very big, very fast and... very hungry, making it hard to build. What AMD needs is a card that can defeat the GTX 580, no matter how hot or power-hungry it is.
  • GeorgeH - Tuesday, November 9, 2010 - link

    Is there any word on a fully functional GF104?

    Nvidia could call it the 560, with 5="Not Gimped".
  • Sihastru - Tuesday, November 9, 2010 - link

    I guess once the GTX 470 goes EOL. If the GTX 460 had all its shaders enabled then the overclocked versions would have cannibalized GTX 470 sales. Even so, it will happen eventually.
  • tomoyo - Tuesday, November 9, 2010 - link

    My guess is there will be GTX 580 derivatives with fewer cores enabled as usual, probably a GTX 570 or something. And then an improved GTX 460 with all cores enabled as the GTX 560.
  • tomoyo - Tuesday, November 9, 2010 - link

    Good to see NVIDIA made a noticeable improvement over the overly hot and power-hungry GTX 480. Unfortunately it's way above my power and noise requirements, but competition is a good thing. Now I'm highly curious how close the Radeon 69xx will come in performance, or whether it can actually beat the GTX 580 in some cases.
    Of course the GTX 480 is completely obsolete now: more power, less speed, more noise, and ugly to look at.
  • 7eki - Tuesday, November 9, 2010 - link

    What we got here today is a higher-clocked, better-cooled GTX 480 with slightly better power consumption. All of that for only $80 MORE! Just about any non-reference GTX 480 is equipped with a much better cooling solution that gives higher OC possibilities and could kick the GTX 580's ass. If we compare a GTX 480 to a GTX 580 clock-for-clock we get about a 3% difference in performance, all thanks to 32 extra CUDA cores and a few more TMUs. How come the reviewers are NOW able to find pros in something they used to criticise 7 months ago? Don't forget that AMD's about to break their Sweet Spot strategy just to cut your hypocrites' tongues. I bet the 6990 is going to be twice as fast as what we got here today. If we really got anything, because I can't really tell the difference.
  • AnnonymousCoward - Tuesday, November 9, 2010 - link

    32W less for 15% more performance, still on 40nm, is a big deal.
  • 7eki - Wednesday, November 10, 2010 - link

    32W and 15% you say? No, it isn't a big deal since AMD's Barts GPU release. Bear in mind that the GTX 580 still consumes more energy than the faster (in most cases) and one-year-older multi-GPU HD 5970. In that case even 60W would sound ridiculously funny. It's not more than a few percent improvement over the GTX 480. If you don't believe it, calculate how long you would have to play on your GTX 580 before the power savings versus a GTX 480 earn you your ~$40 back. Not to mention (again) that a non-reference GTX 480 provides much better cooling and OC possibilities. NVIDIA's digging their own grave, just like they did by releasing the GTX 460. The only thing that's left for them right now is to trick the reviewers. But who cares. The GTX 580 isn't going to make them sell more mainstream GPUs. It isn't NVIDIA who's cutting HD 5970 prices right now; it was AMD, by releasing the HD 6870/6850 and announcing the 6970. It should have been mentioned by all of you reviewers who treat the matter seriously. NVIDIA's a treacherous snake and the reviewers' job is not to let such things happen.
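
    As a rough illustration of that payback arithmetic, here is a minimal sketch in Python. The ~32W delta and the ~$40 figure are simply the numbers quoted in this thread, and the $0.12/kWh electricity rate is an assumed value, not a measurement:

        # Back-of-the-envelope payback estimate for the GTX 580's power savings.
        # Assumptions: the 32W load-power delta and ~$40 target are the figures
        # quoted in this comment thread; the electricity rate is a hypothetical
        # $0.12 per kWh.
        power_delta_w = 32              # claimed GTX 580 vs GTX 480 savings at load (W)
        target_savings_usd = 40.0       # dollar figure quoted in the comment
        electricity_usd_per_kwh = 0.12  # assumed utility rate

        # Dollars saved per hour of full-load gaming
        savings_per_hour = (power_delta_w / 1000.0) * electricity_usd_per_kwh

        # Hours of gaming needed for the savings to reach the target
        hours_needed = target_savings_usd / savings_per_hour
        print(f"~{hours_needed:,.0f} hours of load to save ${target_savings_usd:.0f}")
        # ~10,400 hours at these assumptions

    At these assumed rates the savings take thousands of hours of full-load gaming to add up, which is the point the comment is driving at.
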
  • Sihastru - Wednesday, November 10, 2010 - link

    Have you heard about the ASUS GTX 580 Voltage Tweak edition that can be clocked up to 1100 MHz? That's more than a 40% OC. Have you seen the EVGA GTX 580 FTW yet?

    The fact that a single-GPU card is in some cases faster than a dual-GPU card built with two of the fastest competing GPUs says a lot of good things about that single-GPU card.

    This "nVidia is the Antichrist" speech is getting old. Repeating it all over the interwebs doesn't make it true.
