OCP Refined, A Word On Marketing, & The Test

As you may recall from our GTX 580 launch article, NVIDIA added a rudimentary OverCurrent Protection (OCP) feature to the GTX 500 series. At the time of the GTX 580 launch, OCP would clamp down on Furmark and OCCT to keep those programs from running at full speed, as the load they generated was so high that it risked damaging the card. As a matter of principle we have been disabling OCP in all of our tests up until now, as OCP only targeted Furmark and OCCT, meaning it didn't provide any real protection for the card in other situations. Our advice to NVIDIA at the time was to expand it to cover the hardware at a generic level, similar to how AMD's PowerTune operates.

We’re glad to report that NVIDIA has taken up at least some of our advice, and that OCP is taking its first step forward since then. Starting with the ForceWare 267 series drivers, NVIDIA is now using OCP at all times, meaning OCP now protects against any possible program that would generate an excessive load (as defined by NVIDIA), and not just Furmark and OCCT. At this time there’s definitely still a driver component involved as NVIDIA still throttles Furmark and OCCT right off the bat, but everything else seems to be covered by their generic detection methods.

At this point our biggest complaint is that OCP’s operation is still not transparent to the end user. If you trigger it you have no way of knowing unless you know how the game/application should already be performing. NVIDIA tells us that at some point this will be exposed through NVIDIA’s driver API, but today is not that day. Along those lines, at least in the case of Furmark and OCCT OCP still throttles to an excessive degree—whereas AMD gets this right and caps anything and everything at the PowerTune limit, we still see OCP heavily clamp these programs to the point that our GTX 590 draws 100W more under games than it does under Furmark. Clamping down on a program to bring power consumption down to safe levels is a good idea, but clamping down beyond that just hurts the user and we hope to see NVIDIA change this.

Finally, the expansion of OCP’s capabilities is going to have an impact on overclocking. As with reporting when OCP is active, NVIDIA isn’t being fully transparent here so there’s a bit of feeling around at the moment. The OCP limit for any card is roughly 25% higher than the official TDP, so in the case of the GTX 590 this would translate into a 450W limit. This limit cannot currently be changed by the end user, so overclocking—particularly overvolting—risks triggering OCP. Depending on how well its generic detection mode works, it may limit extreme overclocking on all NVIDIA cards with the OCP hardware at the moment. Even in our own overclock testing we have some results that may be compromised by OCP, so it’s definitely something that needs to be considered.
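
The 25% figure above makes the trip point easy to estimate. As a minimal sketch of the arithmetic (the helper name is ours, not any NVIDIA API; the GTX 590's 365W official TDP is the input):

```python
# Estimate an OCP trip point from a card's TDP, per the ~25% headroom
# figure described in the text. This is back-of-the-envelope arithmetic,
# not a real driver interface.
def ocp_limit(tdp_watts: float, headroom: float = 0.25) -> float:
    """Return the estimated OverCurrent Protection limit in watts."""
    return tdp_watts * (1.0 + headroom)

print(ocp_limit(365))  # GTX 590 (365W TDP): 456.25W, in line with the ~450W limit
```

The same arithmetic applies to any card with the OCP hardware; only the TDP input changes.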

Moving on, I'd like to hit upon marketing quickly. Normally the intended market and uses of most video cards are rather straightforward. For the 6990, AMD pushed raw performance and Eyefinity (particularly 5x1P), while for the GTX 590 NVIDIA is pushing raw performance, 3D Vision Surround, and PhysX (even if dedicating a GF110 to PhysX is overkill). However every now and then something comes across that catches my eye. In this case NVIDIA is also using the GTX 590 to reiterate their support for SuperSample Anti-Aliasing under DX9, DX10, and DX11. SSAA is easily the best hidden feature of the GTX 400/500 series, and it's one that doesn't get much marketing attention from NVIDIA. So it's good to see it getting some attention—certainly there's no card better suited for it than the GTX 590.

Last, but not least, we have the test. For the launch of the GTX 590 NVIDIA is providing us with ForceWare 267.71 beta, which adds support for the GTX 590; there are no other significant changes. For cooling purposes we have removed the case fan behind PEG1 on our test rig—while an 11” card is short enough to fit it, it’s counterproductive for a dual-exhaust design. Finally, in order to better compare the GTX 590 to the 6990’s OC/Uber mode, we’ve given our GTX 590 a slight overclock. Our GTX 590 OC is clocked at 750/900, a 143MHz (23%) core and 47MHz (5%) memory overclock. Meanwhile the core voltage was raised from 0.912v to 0.987v. With the poor transparency of OCP’s operation however, we are not 100% confident that we haven’t triggered OCP, so please keep that in mind when looking at the overclocked results.
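
Working backwards from the deltas quoted above, the stock GTX 590 runs its core at 607MHz and its memory at roughly 853MHz. A quick sketch of the arithmetic (the helper name is ours):

```python
# Verify the overclock deltas quoted above against the GTX 590's stock
# clocks (607MHz core, ~853MHz memory).
def oc_delta(stock_mhz: float, oc_mhz: float):
    """Return (MHz gained, percent gained) for an overclock."""
    gain = oc_mhz - stock_mhz
    return gain, 100.0 * gain / stock_mhz

print(oc_delta(607, 750))  # core: 143MHz, ~23.6%
print(oc_delta(853, 900))  # memory: 47MHz, ~5.5%
```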

Update: April 2nd, 2011: Starting with the 267.91 drivers and release 270 drivers, NVIDIA has disabled overvolting on the GTX 590 entirely. This is likely a consequence of several highly-publicized incidents where GTX 590 cards died as a result of overvolting. Although it's unusual to see a card of this caliber locked out of overvolting altogether, clearly this is where NVIDIA intends to be.

As an editorial matter we never remove anything from a published article so our GTX 590 OC results will remain. However with these newer drivers it is simply not possible to attain them.

CPU: Intel Core i7-920 @ 3.33GHz
Motherboard: Asus Rampage II Extreme
Chipset Drivers: Intel 9.1.1.1015
Hard Disk: OCZ Summit (120GB)
Memory: Patriot Viper DDR3-1333 3x2GB (7-7-7-20)
Video Cards:
  AMD Radeon HD 6990
  AMD Radeon HD 6970
  AMD Radeon HD 6950 2GB
  AMD Radeon HD 6870
  AMD Radeon HD 6850
  AMD Radeon HD 5970
  AMD Radeon HD 5870
  AMD Radeon HD 5850
  AMD Radeon HD 5770
  AMD Radeon HD 4870X2
  AMD Radeon HD 4870
  EVGA GeForce GTX 590 Classified
  NVIDIA GeForce GTX 580
  NVIDIA GeForce GTX 570
  NVIDIA GeForce GTX 560 Ti
  NVIDIA GeForce GTX 480
  NVIDIA GeForce GTX 470
  NVIDIA GeForce GTX 460 1GB
  NVIDIA GeForce GTX 460 768MB
  NVIDIA GeForce GTS 450
  NVIDIA GeForce GTX 295
  NVIDIA GeForce GTX 285
  NVIDIA GeForce GTX 260 Core 216
Video Drivers:
  NVIDIA ForceWare 262.99
  NVIDIA ForceWare 266.56 Beta
  NVIDIA ForceWare 266.58
  NVIDIA ForceWare 267.71
  AMD Catalyst 10.10e
  AMD Catalyst 11.1a Hotfix
  AMD Catalyst 11.4 Preview
OS: Windows 7 Ultimate 64-bit

 

123 Comments

  • Ruger22C - Thursday, March 24, 2011 - link

    Don't spew nonsense to the people reading this! Write a disclaimer if you're going to do that.
  • The Finale of Seem - Saturday, March 26, 2011 - link

    Um...no. For one, HUD elements tend to shrink in physical size as resolution rises, meaning that games with a lot of HUD (WoW comes to mind) benefit by letting you see more of what's going on, which means that 720p is pretty friggin' awful. For two, 1920x1080 has become the standard for most monitors over 21" or so, and a lot of gamers get 1920x1080 displays, especially if they're also watching 1080p video or doing significant multitasking. Non-native resolutions look like ass, and as such 1680x1050 is right out, as you won't want to play at anything but 1920x1080.

    Now, you can say that there isn't much point going above that, and right now, that may be so as cost is pretty prohibitive, but that may not always be the case.
  • rav55 - Thursday, March 31, 2011 - link

    What good is it if you can't buy it? Nvidia cherry-picked the GPUs to work on this card and could only release a little over 1000 units. It is now sold out in the US and available in limited quantities in Europe.

    Basically the GTX 590 is vapourware!!! What a joke!
  • wellortech - Thursday, March 24, 2011 - link

    Reviews seem to still agree that 6950CF or 570 SLI are just as powerful, and much less expensive. Guess I'll be keeping my pair of 6950s while continuing to enjoy 30" 2550x1600 heaven.
  • DanNeely - Thursday, March 24, 2011 - link

    Yeah, these only really make sense if you're going for a 4-GPU setup in an ATX box, or have a larger mATX case and want 2 GPUs and some other card.
  • jfelano - Thursday, March 24, 2011 - link

    You go boy. I'll continue to have a life.
  • The_Comfy_Chair - Thursday, March 24, 2011 - link

    Get over yourself.

    YOU are trolling on a forum about a video card on a tech-geek site on the internet. You have no more of a life than wellortech or anyone else here - self included.
  • ShumOSU - Thursday, March 24, 2011 - link

    You're 16,000 pixels short. :-)
  • egandt - Thursday, March 24, 2011 - link

    Would have been better to see what these cards did with 3x 1920x1200 displays, as obviously they are overkill for any single display.
  • Dudler - Thursday, March 24, 2011 - link

    Couldn't agree more, but since we know from the 1.5GB 580 that nVidia cards do poorly at higher resolutions, AnandTech will probably never test any such setup. Expect 12x10 instead, as nVidia tends to do better at low resolutions than AMD. 19x12 is already irrelevant with these cards.
