GeForce Experience & The Test

Before jumping into our test results, there’s one last thing we wanted to touch upon quickly. Along with announcing the GTX 690 at the NVIDIA Gaming Festival 2012, NVIDIA also used the occasion to announce a new software utility called GeForce Experience.

For some time now NVIDIA has offered a feature called Optimal Playable Settings (OPS) through GeForce.com: a series of game setting configurations that NVIDIA has tested and recommends for various GeForce video cards. It's a genuinely useful service, but it's not well known and it only covers desktop GPUs.

With GeForce Experience NVIDIA is taking that concept one step further, offering an application that interfaces with both the game and the successor to NVIDIA's OPS service. The key difference is that rather than hosting the settings on a website and requiring the user to punch them in by hand, GeForce Experience can fetch those settings from NVIDIA and make the changes on its own. This makes the process much more accessible: not only do users not need to know how to access their settings or what those settings do, but the moment NVIDIA bundles this with their drivers it will be far more widespread than OPS ever was.
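Conceptually, the fetch-and-apply workflow is simple. The sketch below illustrates the idea in Python; the URL, query parameters, and config file format are purely hypothetical, as NVIDIA has not published the actual GeForce Experience protocol.

```python
import json
import urllib.request

# Hypothetical endpoint and config path -- illustrative only, not
# NVIDIA's actual service or file layout.
SETTINGS_URL = "https://example.com/gfe/settings?gpu=GTX690&game=bf3"
GAME_CONFIG = "settings.cfg"

def fetch_recommended_settings(url):
    """Download a recommended settings profile as a dict of key/value pairs."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def apply_settings(settings, path):
    """Write each recommended setting into a simple key=value config file."""
    with open(path, "w") as f:
        for key, value in settings.items():
            f.write(f"{key}={value}\n")
```

The point is that the user never sees the settings at all: the client identifies the GPU and game, pulls the matching profile, and rewrites the game's configuration directly.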

The other change is that NVIDIA is moving away from manual testing in favor of automated testing. OPS profiles are generated by hand, whereas GeForce Experience settings will be based on automated testing, allowing NVIDIA to cover a wider range of games and video cards, most importantly mobile GPUs. NVIDIA already maintains GPU farms for driver regression testing, so using those farms to generate and validate game settings is a logical extension of that concept.
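The automated search described above can be pictured as a simple loop: try quality presets from best to worst on a test rig and keep the highest one that sustains a playable frame rate. This is our own illustrative sketch, not NVIDIA's actual farm code, and the preset names, target frame rate, and benchmark numbers are invented for the example.

```python
# Illustrative sketch of an automated playable-settings search.
# Presets are ordered from highest to lowest quality.
PRESETS = ["ultra", "high", "medium", "low"]

def find_playable_preset(benchmark, target_fps=40.0):
    """Return the highest-quality preset whose benchmarked fps meets the target."""
    for preset in PRESETS:
        if benchmark(preset) >= target_fps:
            return preset
    return PRESETS[-1]  # nothing met the target; fall back to lowest quality

# Stand-in benchmark: on a real GPU farm this would launch the game on a
# test rig and measure average fps for the given preset.
fake_fps = {"ultra": 28.0, "high": 37.5, "medium": 52.0, "low": 80.0}
print(find_playable_preset(lambda p: fake_fps[p]))  # -> medium
```

A real system would search a much larger space (resolution, AA, per-feature toggles) per GPU and per game, which is exactly why automation scales where hand testing does not.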

GeForce Experience will be launching in beta form on June 6th.

The Test

The press drivers for the GTX 690 are 301.33, though it sounds like NVIDIA will actually launch with a slightly newer version today. As the GTX 690 is launching so soon after the GTX 680, these drivers are virtually identical to the GTX 680 launch drivers. Meanwhile for the GeForce 500 series we're using 301.24, and for the AMD Radeon cards Catalyst 12.4.

We’d also like to give a shout-out to Asus, who sent us one of their wonderful PA246Q 24” P-IPS monitors to allow us to complete our monitor set for multi-monitor testing. From here on we’ll be able to offer multi-monitor results for our high-end cards, and a number of cards have already had that data added in Bench.

Next, based on an informal poll on our forums, we're going to continue our existing SLI/CF testing methodology. All of our test results are taken with both cards directly next to each other rather than spaced apart, in order to test the worst-case scenario. Users with such a configuration are a minority based on our data, but there are still enough of them that we believe it should be covered.

Finally, we’d like to note that since we don’t have a matching pair of 7970 reference cards, we’re using our one reference card along with XFX’s R7970 BEDD. For gaming performance, power consumption, and temperatures this doesn’t have a material impact, but it means we don’t have meaningful noise performance for the 7970.

CPU: Intel Core i7-3960X @ 4.3GHz
Motherboard: EVGA X79 SLI
Chipset Drivers: Intel 9.2.3.1022
Power Supply: Antec True Power Quattro 1200
Hard Disk: Samsung 470 (256GB)
Memory: G.Skill Ripjaws DDR3-1867 4 x 4GB (8-10-9-26)
Case: Thermaltake Spedo Advance
Monitor: Samsung 305T
Asus PA246Q
Video Cards: AMD Radeon HD 7970
AMD Radeon HD 6990
AMD Radeon HD 6970
AMD Radeon HD 5970
NVIDIA GeForce GTX 690
NVIDIA GeForce GTX 680
NVIDIA GeForce GTX 590
NVIDIA GeForce GTX 580
Video Drivers: NVIDIA ForceWare 301.24
NVIDIA ForceWare 301.33
AMD Catalyst 12.4
OS: Windows 7 Ultimate 64-bit

Comments

  • bobsmith1492 - Thursday, May 03, 2012 - link

    It's not that rare; I got a fairly inexpensive 24" 1920x1200 HP monitor from Newegg a year ago. There weren't many options but it was there and it's great.
  • a5cent - Thursday, May 03, 2012 - link

    You are right that the average Joe doesn't have a 1920x1200 monitor, but this is an enthusiast web-site! Not a single enthusiast I know owns a 1080 display. 1920x1200 monitors aren't hard to find, but you will need to spend a tad more.
  • CeriseCogburn - Saturday, May 05, 2012 - link

    Nope, 242 vs 16 is availability, you lose miserably. You all didn't suddenly have one along with your "friends" you suddenly acquired and have memorized their monitor sizes instantly as well.
    ROFL - the lies are innumerable at this point.
  • UltraTech79 - Thursday, May 03, 2012 - link

    They make up about 10% stock. I wouldn't call that very rare. Newegg and other places have a couple dozen+ to choose from.

    Maybe YOU dont buy very much.
  • CeriseCogburn - Tuesday, May 08, 2012 - link

    Closer to 5% than it is to 10%, and they cost a lot more for all the moaning penny pinchers who've suddenly become flush.
  • Digimonkey - Thursday, May 03, 2012 - link

    It's either 1920x1200 @ 60hz, or 1920x1080 @ 120hz. I prefer smoother gameplay over 120 pixels. Also I know quite a few gamers that like using their TV for their PC gaming, so this would also be limited to 1080p.
  • CeriseCogburn - Friday, May 04, 2012 - link

    No one here is limited, they all said, so no one uses their big screens, they all want it @ 1200P now because amd loses not so badly there...
    ROFL
  • Dracusis - Thursday, May 03, 2012 - link

    I'd be more worried about AMD's performance going down in certain games due to Crossfire than something as trivial as this. As a 4870X2 owner I can tell you this is not at all uncommon for AMD. I still have to disable 1 GPU in most games, including BF3, because AMD's drivers for any card more than 12 months old are just terrible. As you can see even the 6990 is being beat by a 6970 in games as modern as Skyrim - their drivers are just full of fail.
  • Galidou - Thursday, May 03, 2012 - link

    A much higher percentage?!? That's 7% more... nothing extraordinary... Let's just say a higher percentage; when you say much, it makes us believe Nvidia's paying you.
  • CeriseCogburn - Saturday, May 05, 2012 - link

    10% you might be able to ignore, 17% you cannot. It's much higher, it changes several of the games here as to who wins in the article in the accumulated benches.
    It's a big difference.
