GeForce Experience & The Test

Before jumping into our test results, there’s one last thing we wanted to touch upon quickly. Along with announcing the GTX 690 at the NVIDIA Gaming Festival 2012, NVIDIA also used the occasion to announce a new software utility called GeForce Experience.

For some time now NVIDIA has offered a feature they call Optimal Playable Settings through GeForce.com, which are a series of game setting configurations that NVIDIA has tested and is recommending for various GeForce video cards. It’s a genuinely useful service, but it’s also not well known and only covers desktop GPUs.

With GeForce Experience NVIDIA is going to be taking that concept one step further, offering an application that interfaces with both the game and the successor to NVIDIA’s OPS service. The key difference is that rather than publishing the settings on a website and requiring the user to punch them in by hand, GeForce Experience can fetch those settings from NVIDIA and apply the changes on its own. This makes the process much more accessible: not only do users not need to know anything about how to access their settings or what those settings do, but the moment NVIDIA includes this with their drivers it will be far more widespread than OPS ever was.
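To illustrate the concept, here is a minimal sketch of the “fetch the settings and apply them” workflow. The service endpoint, JSON response, and key=value config format are hypothetical stand-ins, since NVIDIA hasn’t published how GeForce Experience actually talks to its backend or to individual games.

```python
# A minimal sketch of the "fetch and apply" idea behind GeForce Experience.
# The endpoint, JSON response, and config file format are hypothetical
# illustrations, not NVIDIA's actual service or file layout.
import json
import urllib.parse
import urllib.request
from pathlib import Path

SETTINGS_SERVICE = "https://example.com/optimal-settings"  # hypothetical endpoint


def fetch_recommended_settings(gpu: str, game: str) -> dict:
    """Ask the (hypothetical) settings service for a recommended configuration."""
    query = urllib.parse.urlencode({"gpu": gpu, "game": game})
    with urllib.request.urlopen(f"{SETTINGS_SERVICE}?{query}") as resp:
        # e.g. {"resolution": "1920x1080", "shadow_quality": "medium", ...}
        return json.load(resp)


def apply_settings(config_path: Path, settings: dict) -> None:
    """Rewrite a simple key=value game config with the recommended values."""
    lines = config_path.read_text().splitlines() if config_path.exists() else []
    current = dict(line.split("=", 1) for line in lines if "=" in line)
    current.update({key: str(value) for key, value in settings.items()})
    config_path.write_text("\n".join(f"{k}={v}" for k, v in current.items()) + "\n")


if __name__ == "__main__":
    recommended = fetch_recommended_settings("GeForce GTX 690", "Crysis: Warhead")
    apply_settings(Path("game.cfg"), recommended)
```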

The other change is that NVIDIA is going to be moving away from manual testing in favor of automated testing. OPS are generated by hand, whereas GeForce Experience settings are going to be based on automated testing, allowing NVIDIA to cover a wider range of games and video cards, most importantly by including mobile video cards. NVIDIA already has GPU farms for driver regression testing, so this is a logical extension of that concept to use those farms to generate and test game settings.
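In spirit, such an automated pass boils down to sweeping quality presets on a test rig and keeping the highest one that still hits a playable frame rate. The sketch below shows that selection logic; the preset list, the 40 fps target, and the run_benchmark() hook are illustrative assumptions rather than details NVIDIA has disclosed.

```python
# A rough sketch of automated playable-settings selection: benchmark each
# quality preset in ascending order and keep the highest one that still
# meets a frame rate target. Presets, target, and the benchmark hook are
# assumptions for illustration, not NVIDIA's actual pipeline.
from typing import Callable, Optional

PRESETS = ["low", "medium", "high", "ultra"]  # ordered lowest to highest quality
TARGET_FPS = 40.0  # an assumed "playable" threshold


def pick_playable_preset(run_benchmark: Callable[[str], float],
                         target_fps: float = TARGET_FPS) -> Optional[str]:
    """Return the highest-quality preset whose average FPS meets the target."""
    best = None
    for preset in PRESETS:
        avg_fps = run_benchmark(preset)  # run the game's benchmark at this preset
        if avg_fps >= target_fps:
            best = preset  # keep climbing while the target is met
        else:
            break  # higher presets will only be slower
    return best
```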

GeForce Experience will be launching in beta form on June 6th.

The Test

The press drivers for the GTX 690 are 301.33, though it sounds like NVIDIA will actually launch with a slightly newer version today. As the GTX 690 is launching so soon after the GTX 680, these drivers are virtually identical to the GTX 680 launch drivers. Meanwhile for the GeForce 500 series we’re using 301.24, and for the AMD Radeon cards Catalyst 12.4.

We’d also like to give a shout-out to Asus, who sent us one of their wonderful PA246Q 24” P-IPS monitors to allow us to complete our monitor set for multi-monitor testing. From here on we’ll be able to offer multi-monitor results for our high-end cards, and a number of cards have already had that data added to Bench.

Next, based on an informal poll on our forums, we’re going to be continuing our existing SLI/CF testing methodology. All of our test results will be with both cards directly next to each other, as opposed to spaced apart, in order to test the worst-case scenario. Users with such a configuration are a minority based on our data, but there are still enough of them that we believe it should be covered.

Finally, we’d like to note that since we don’t have a matching pair of 7970 reference cards, we’re using our one reference card along with XFX’s R7970 BEDD. For gaming performance, power consumption, and temperatures this doesn’t have a material impact, but it means we don’t have meaningful noise results for 7970 CF.

CPU: Intel Core i7-3960X @ 4.3GHz
Motherboard: EVGA X79 SLI
Chipset Drivers: Intel 9.2.3.1022
Power Supply: Antec True Power Quattro 1200
Hard Disk: Samsung 470 (256GB)
Memory: G.Skill Ripjaws DDR3-1867 4 x 4GB (8-10-9-26)
Case: Thermaltake Spedo Advance
Monitors: Samsung 305T, Asus PA246Q
Video Cards: AMD Radeon HD 7970, AMD Radeon HD 6990, AMD Radeon HD 6970, AMD Radeon HD 5970, NVIDIA GeForce GTX 690, NVIDIA GeForce GTX 680, NVIDIA GeForce GTX 590, NVIDIA GeForce GTX 580
Video Drivers: NVIDIA ForceWare 301.24, NVIDIA ForceWare 301.33, AMD Catalyst 12.4
OS: Windows 7 Ultimate 64-bit

 

Comments

  • InsaneScientist - Sunday, May 6, 2012 - link

    Or don't...

    It's 2 days later, and you've been active in the comments up through today. Why'd you ignore this one, Cerise?
  • CeriseCogburn - Sunday, May 6, 2012 - link

    Because you idiots aren't worth the time and last review the same silverblue stalker demanded the links to prove my points and he got them, and then never replied.
    It's clear what providing proof does for you people, look at the sudden 100% ownership of 1920x1200 monitors..
    ROFL
    If you want me to waste my time, show a single bit of truth telling on my point on the first page.
    Let's see if you pass the test.
    I'll wait for your reply - you've got a week or so.
  • KompuKare - Thursday, May 3, 2012 - link

    It is indeed sad. AMD comes up with really good hardware features like Eyefinity but then never polishes up the drivers properly. Looking at some of the Crossfire results is sad too: in Crysis and BF3 CF scaling is better than SLI (unsure, but I think the trifire and quadfire results for those games are even more in AMD's favour), but in Skyrim it seems that CF is totally broken.

    Of course compared to Intel, AMD's drivers are near perfect but with a bit more work they could be better than Nvidia's too rather than being mostly at 95% or so.

    Tellingly, JHH did once say that Nvidia were a software company, which was a strange thing for a hardware manufacturer to say. But this also seems to mean that they've forgotten the most basic thing which all chip designers should know: how to design hardware that works. Yes, I'm talking about bumpgate.

    See, despite all I said about AMD's drivers, I will never buy Nvidia hardware again after my personal experience of their poor QA. My 8800GT, my brother's 8800GT, this 8400M MXM I had, plus a number of laptops plus one nForce motherboard: they all had one thing in common, poorly made chips from BigGreen, and they all died way before they were obsolete.

    Oh, and as pointed out in the Anand VC&G forums earlier today:

    "Well, Nvidia has the title of the worst driver bug in history at this point-
    http://www.zdnet.com/blog/hardware/w...hics-card/7... "

    killing cards with a driver is a record.
  • Filiprino - Thursday, May 3, 2012 - link

    Yep, that's true. They killed cards with a driver. They should implement hardware auto shutdown, like CPUs. As for the nForce, I had one motherboard, the best nForce they made: nForce 2 for AMD Athlon. The rest of mobo chipsets were bullshit, including nForce 680.

    The QA I don't think is NVIDIA's fault but videocard manufacturers.
  • KompuKare - Thursday, May 3, 2012 - link


    "The QA I don't think is NVIDIA's fault but videocard manufacturers."


    No, 100% Nvidia's fault. Although maybe QA isn't the right word. I was referring to Nvidia using the wrong solder underfill for a few million chips (the exact number is unknown): they were mainly mobile parts, and Nvidia had to put $250 million aside to settle a class action.

    http://en.wikipedia.org/wiki/GeForce_8_Series#Prob...

    Although that wiki article is rather lenient towards Nvidia, since that bit about fan speeds is a red herring: more accurately, it was Nvidia which spec'ed their chips to a certain temperature, and designs which run way below that will have put less stress on the solder, but to say it was poor OEM and AIB design which led to the problem is not correct. Anyway, the proper exposé was by Charlie D. in the Inquirer and later SemiAccurate.
  • CeriseCogburn - Friday, May 4, 2012 - link

    But in fact it was a bad heatsink design, thank HP, and view the thousands of heatsink repairs, including the "add a copper penny" method to reduce the giant gap between the HS and the NV chip.
    Charlie was wrong, a liar, again, as usual.
  • KompuKare - Friday, May 4, 2012 - link

    Don't be silly. While HP's DV6000s were the most notorious failures, and that was due to HP's poorly designed heatsink/cooling, bumpgate also saw Dells, Apples and others:

    http://www.electronista.com/articles/10/09/29/suit...
    http://www.nvidiadefect.com/nvidia-settlement-t874...

    The problem was real, continues to be real and also affects G92 desktop parts and certain nForce chipsets like the 7150.

    Yes, the penny shim trick will fix it for a while, but if you actually were to read up on the forums of technicians who fix laptops, that plus reflows are only a temporary fix because the actual chips are flawed. Re-balling with new, better solder is a better solution, but not many offer that fix since it involves hundreds of tiny solder balls per chip.

    Before blindly leaping to Nvidia's defence like a fanboy, please do some research!
  • CeriseCogburn - Saturday, May 5, 2012 - link

    Before blindly taking the big lie from years ago repeated above to attack nvidia for no reason at all, other than that all you have is years-old misinformation to wail on about while telling someone else some more lies about it, check your own immense bias and lack of knowledge, since I had to point out the truth for you to find. You also forgot the DV9000, dv2000 and Dell systems with poor HS design, let alone apple amd console video chip failings, and the fact that payment was made and restitution was delivered, which you also did not mention because of your fanboy problems, obviously in amd's favor.
  • Ashkal - Thursday, May 3, 2012 - link

    In the price comparison in the Final Words you are not referring to AMD products. I think AMD is better in price/performance ratio.
  • prophet001 - Thursday, May 3, 2012 - link

    I agree
