GeForce Experience & The Test

Before jumping into our test results, there’s one last thing we wanted to touch upon quickly. Along with the GTX 690, NVIDIA used the NVIDIA Gaming Festival 2012 to announce a new software utility called GeForce Experience.

For some time now NVIDIA has offered Optimal Playable Settings (OPS) through GeForce.com, a set of game setting configurations that NVIDIA has tested and recommends for various GeForce video cards. It’s a genuinely useful service, but it’s not well known and it only covers desktop GPUs.

With GeForce Experience NVIDIA is taking that concept one step further, offering an application that interfaces with both the game and the successor to NVIDIA’s OPS service. The key difference is that rather than posting the settings on a website and requiring the user to punch them in by hand, GeForce Experience can fetch those settings from NVIDIA and make the changes on its own. This makes the process much more accessible: users don’t need to know how to access their settings or what those settings do, and the moment NVIDIA includes the utility with their drivers it will be far more widespread than OPS ever was.
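
To put the concept in concrete terms, here’s a minimal sketch of what fetching and applying a recommended settings profile could look like. NVIDIA hasn’t published how GeForce Experience works under the hood, so the service URL, the JSON profile format, and the config file path below are hypothetical placeholders, purely for illustration.

```python
# Illustrative sketch only: the settings service URL, the JSON profile format,
# and the game config path are hypothetical stand-ins, not NVIDIA's actual API.
import json
from pathlib import Path
from urllib.parse import urlencode
from urllib.request import urlopen

SETTINGS_SERVICE = "https://example.com/recommended-settings"  # hypothetical endpoint


def fetch_recommended(gpu: str, game: str) -> dict:
    """Ask the settings service for a {setting: value} profile for this GPU/game pair."""
    url = f"{SETTINGS_SERVICE}?{urlencode({'gpu': gpu, 'game': game})}"
    with urlopen(url) as response:
        return json.load(response)


def apply_to_config(profile: dict, config_path: Path) -> None:
    """Rewrite a simple key=value config file with the recommended values."""
    lines = [f"{key}={value}" for key, value in sorted(profile.items())]
    config_path.write_text("\n".join(lines) + "\n")


if __name__ == "__main__":
    profile = fetch_recommended("GeForce GTX 690", "Battlefield 3")
    apply_to_config(profile, Path("Documents/BF3/settings.ini"))  # hypothetical path
```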

The other change is that NVIDIA is moving away from manual testing in favor of automated testing. OPS profiles are generated by hand, whereas GeForce Experience settings will be generated through automated testing, allowing NVIDIA to cover a wider range of games and video cards, most importantly mobile video cards. NVIDIA already maintains GPU farms for driver regression testing, so using those farms to generate and validate game settings is a logical extension of that concept.
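
As a toy illustration of what automated settings generation amounts to, the sketch below sweeps quality presets from highest to lowest and keeps the first one whose benchmarked frame rate clears a playability target. The preset names, the 40fps target, and the canned benchmark numbers are all assumptions made for the example; NVIDIA’s actual test harness is not public.

```python
# Toy version of automated "optimal playable settings" generation: try presets
# from best-looking to worst and keep the first one that hits the fps target.
# The presets, target, and benchmark numbers below are made up for illustration.
from typing import Callable

PRESETS = ["Ultra", "Very High", "High", "Medium", "Low"]  # hypothetical presets
TARGET_FPS = 40.0  # hypothetical playability threshold


def pick_playable_preset(run_benchmark: Callable[[str], float]) -> str:
    """Return the highest-quality preset whose benchmarked average fps meets the target."""
    for preset in PRESETS:
        if run_benchmark(preset) >= TARGET_FPS:
            return preset
    return PRESETS[-1]  # nothing met the target, so fall back to the lowest preset


if __name__ == "__main__":
    # Canned results standing in for a real benchmark run.
    fake_results = {"Ultra": 28.0, "Very High": 37.5, "High": 46.2, "Medium": 61.0, "Low": 90.0}
    print(pick_playable_preset(lambda preset: fake_results[preset]))  # prints "High"
```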

GeForce Experience will be launching in beta form on June 6th.

The Test

The press drivers for the GTX 690 are 301.33, though it sounds like NVIDIA will actually launch with a slightly newer version today. As the GTX 690 is launching so soon after the GTX 680, these drivers are virtually identical to the GTX 680 launch drivers. Meanwhile for the GeForce 500 series we’re using 301.24, and for the AMD Radeon cards, Catalyst 12.4.

We’d also like to give a shout-out to Asus, who sent us one of their wonderful PA246Q 24” P-IPS monitors to allow us to complete our monitor set for multi-monitor testing. From here on we’ll be able to offer multi-monitor results for our high-end cards, and a number of cards have already had that data added in Bench.

Next, based on an informal poll on our forums, we’re going to continue using our existing SLI/CF testing methodology: all of our test results are with both cards directly next to each other rather than spaced apart, in order to test the worst-case scenario. Based on our data, users with such a configuration are a minority, but there are still enough of them that we believe it should be covered.

Finally, we’d like to note that since we don’t have a matching pair of 7970 reference cards, we’re using our one reference card along with XFX’s R7970 BEDD. For gaming performance, power consumption, and temperatures this doesn’t have a material impact, but it does mean we can’t present meaningful noise results for the 7970.

CPU: Intel Core i7-3960X @ 4.3GHz
Motherboard: EVGA X79 SLI
Chipset Drivers: Intel 9.2.3.1022
Power Supply: Antec True Power Quattro 1200
Hard Disk: Samsung 470 (256GB)
Memory: G.Skill Ripjaws DDR3-1867 4 x 4GB (8-10-9-26)
Case: Thermaltake Spedo Advance
Monitors: Samsung 305T, Asus PA246Q
Video Cards: AMD Radeon HD 7970, AMD Radeon HD 6990, AMD Radeon HD 6970, AMD Radeon HD 5970, NVIDIA GeForce GTX 690, NVIDIA GeForce GTX 680, NVIDIA GeForce GTX 590, NVIDIA GeForce GTX 580
Video Drivers: NVIDIA ForceWare 301.24, NVIDIA ForceWare 301.33, AMD Catalyst 12.4
OS: Windows 7 Ultimate 64-bit

 

Comments

  • CeriseCogburn - Thursday, May 10, 2012 - link

    The GTX 680 by EVGA in a single SKU outsells the combined total sales of the 7870 and 7850 at Newegg.
    nVidia "vaporware" sells more units than the proclaimed "best deal" 7000 series AMD cards.
    ROFL
    Thanks for not noticing.
  • Invincible10001 - Sunday, May 13, 2012 - link

    Maybe a noob question, but can we expect a mobile version of the 690 on laptops anytime soon?
  • trumpetlicks - Thursday, May 24, 2012 - link

    Compute performance in this case may have to do with 2 things:
    - Amount of memory available for the threaded computational algorithm being run, and
    - the memory IO throughput capability.

    From the rumor-mill, the next NVidia chip may contain 4 GB per chip and a 512-bit bus (twice as wide as the GK104's 256-bit bus).

    If you can't feed the beast as fast as it can eat it, then adding more cores won't increase your overall performance.
  • Joseph Gubbels - Tuesday, May 29, 2012 - link

    I am a new reader and equally new to the subject matter, so sorry if this is a dumb question. The second page mentioned that NVIDIA will be limiting its partners' branding of the cards, and that the first generation of GTX 690 cards are reference boards. Does NVIDIA just make a reference design that other companies use to make their own graphics cards? If not, then why would anyone but NVIDIA have any branding on the cards?
  • Dark0tricks - Saturday, June 2, 2012 - link

    anyone who sides with AMD or NVIDIA are retards - side with yourself as a consumer - buy the best card at the time that is available AND right for your NEEDs.

    fact is the 690 is trash regardless of whether you are comparing it to an NVIDIA card or an AMD card - if I'm buying a card like a 690 why the FUCK would I want anything below 1200p
    even if that is uncommon it's mfing trash of a $1000 card considering:

    $999 GeForce GTX 690
    $499 GeForce GTX 680
    $479 Radeon HD 7970

    and that SLI and CF both beat (or equal) the 690 at higher res's and cost less (by $1 for NVIDIA, but still, like srsly wtf NVIDIA!?, and $40 for AMD) ... WHAT!?

    furthermore, you guys fighting over bias when the WHOLE mfing GFX community (companies, software developers) is built on bias is utterly ridiculous. GFX vendors (AMD and NVIDIA) have skewed results for games for the last decade+, and software vendors too - there need to be laws against specifically building software for a particular graphics card, in addition to making the software work worse on the other (this applies to both companies)

    hell, workstation graphics cards are a very good example of how the industry likes to screw over consumers (if you ever BIOS modded - not just soft modded - a normal consumer card into a workstation card, you would know all that extra charge (up to 70% extra for the same processor) for a workstation card is BS, and if the government cleaned up their shitty policies we the consumers would be better for it)
  • nyran125 - Monday, June 4, 2012 - link

    yep........

    Ultra expensive and Ultra pointless.
  • kitty4427 - Monday, August 20, 2012 - link

    I can't seem to find anything suggesting that the beta has started...
  • trameaa - Friday, March 1, 2013 - link

    I know this is a really old review, and everyone has long since stopped the discussion - but I just couldn't resist posting something after reading through all the comments. Understand, I mean no disrespect to anyone at all by saying this, but it really does seem like a lot of people haven't actually used these cards first hand.

    I see all this discussion of nVidia surround type setups with massive resolutions and it makes me laugh a little. The 690 is obviously an amazing graphics card. I don't have one, but I do use 2x680 in SLI and have for some time now.

    As a general rule, these cards have nowhere near the processing power necessary to run those gigantic screen resolutions with all the settings cranked up to maximum detail, 8xAA, 16xAF, tessellation, etc....

    In fact, my 680 SLI setup can easily be running as low as 35 fps in a game like Metro 2033 with every setting turned up to max - and that is at 1920x1080.

    So, for all those people that think buying a $1000 graphics card means you'll be playing every game out there with every setting turned up to max across three 1920x1200 displays - I promise you, you will not - at least not at a playable frame rate.

    To do that, you'll be realistically looking at 2x$1000 graphics cards, a ridiculous power supply, and by the way you better make sure you have the processing power to push those cards. Your run of the mill i5 gaming rig isn't gonna cut it.
  • Utomo - Friday, October 25, 2013 - link

    More than a year since it was announced. I hope new products will be better. My suggestions: 1. Add HDMI, it is standard. 2. Consider allowing us to add memory / SSD for better/faster performance, especially for rendering 3D animation and other work.
  • TPLVG - Sunday, March 5, 2017 - link

    The GTX 690 is known as "The nuclear bomb" in Chinese IT communities because of its power consumption and temperature.
