Software: GeForce Experience, Out of Beta

Along with the launch of the GTX 780 hardware, NVIDIA is also using this opportunity to announce and roll out new software. Though they are (and always will be) fundamentally a hardware company, NVIDIA has been finding that software is increasingly important to the sales of their products. As a result the company has taken on several software initiatives over the years, both on the consumer side and the business side. To that end the products launching today are essentially the spearhead of a larger NVIDIA software ecosystem.

The first item on the list is GeForce Experience, NVIDIA's game settings advisor. You may remember GeForce Experience from the launch of the GTX 690, which is when it was first announced. The actual rollout of GeForce Experience was slower than NVIDIA projected, having gone from announcement to final release in just over a year. Nevertheless, there is light at the end of the tunnel: with version 1.5, GeForce Experience is finally out of beta and is being qualified as release quality.

So what is GeForce Experience? In a nutshell, GFE is a tool that recommends game settings tailored to your specific hardware. The concept itself is not new, as games have long auto-detected hardware and tried to set appropriate settings on their own, and NVIDIA has toyed with the concept before with their Optimal Playable Settings (OPS) service. The difference between those implementations and GFE comes down to who's doing the work of figuring this out, and how much work is being done.

With OPS, NVIDIA was essentially writing out recommended settings by hand based on human play testing. That process is of course slow, making it hard to cover a wide range of hardware and to get settings out for new games in a timely manner. Meanwhile, with auto-detection built into games, the quality of the recommendations isn't a particular issue, but most games base their automatic settings on a list of known hardware profiles, which means their built-in auto-detection routines are easily fouled up by newer hardware. Simply put, it doesn't do NVIDIA any good if a graphical showcase game like Crysis 3 selects the lowest quality settings because it doesn't know what a GTX 780 is.
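To illustrate the failure mode (the profile table and preset names below are made up purely for the sake of example), a profile-based detector is essentially a lookup table keyed on known GPUs, and anything it has never seen falls through to the most conservative preset:

```python
# Hypothetical sketch of profile-based auto-detection in a game:
# the table only knows about GPUs that existed when the game shipped,
# so any newer card falls through to the lowest-quality fallback.

KNOWN_GPU_PROFILES = {
    "GeForce GTX 680": "High",
    "GeForce GTX 580": "Medium",
    "GeForce GTX 460": "Low",
}

def builtin_auto_detect(gpu_name: str) -> str:
    # A GTX 780 isn't in the table, so it gets the "Low" preset
    # despite being faster than anything the game knows about.
    return KNOWN_GPU_PROFILES.get(gpu_name, "Low")

print(builtin_auto_detect("GeForce GTX 780"))  # -> Low
print(builtin_auto_detect("GeForce GTX 680"))  # -> High
```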

NVIDIA's solution of choice is to take on most of this work themselves, and then move virtually all of it to automation. From a business perspective this makes great sense for NVIDIA, as they already have the critical component for such a service: the hardware. NVIDIA already operates large GPU farms in order to test drivers, a process that isn't all that different from what they would need to do to automate the search for optimal settings. Rather than regression testing and looking for errors, NVIDIA's GPU farms can iterate through various settings on various GPUs in order to find the best combination that still reaches a playable level of performance.

By iterating through the massive matrix of settings most games offer, NVIDIA's GPU farms can do most of the work required. What's left for humans is writing test cases for new games, something already necessary for driver/regression testing, and then identifying which settings are more desirable from a quality perspective so that those can be weighted and scored in the benchmarking process. This means it's not an entirely human-free process, but having a handful of engineers writing test cases and assigning weights is a much more productive use of time than having humans test everything by hand, as was the case with OPS.
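As a rough sketch of what that search looks like (the setting names, quality weights, and target framerate here are invented; the real service runs actual game test cases on NVIDIA's GPU farms), the process boils down to walking the settings matrix, discarding combinations that fall below a playable framerate, and keeping the combination with the highest quality score:

```python
from itertools import product

# Invented setting names and human-assigned quality weights; these stand in
# for the per-game data NVIDIA's engineers would provide.
SETTINGS_MATRIX = {
    "resolution": ["1920x1080", "2560x1440"],
    "texture_quality": ["Medium", "High", "Ultra"],
    "anti_aliasing": ["Off", "FXAA", "4x MSAA"],
}
QUALITY_WEIGHTS = {
    "resolution": {"1920x1080": 1, "2560x1440": 2},
    "texture_quality": {"Medium": 1, "High": 2, "Ultra": 3},
    "anti_aliasing": {"Off": 0, "FXAA": 1, "4x MSAA": 2},
}
TARGET_FPS = 40  # minimum framerate considered playable

def quality_score(combo: dict) -> int:
    """Sum the weights of the chosen options; higher means better image quality."""
    return sum(QUALITY_WEIGHTS[setting][choice] for setting, choice in combo.items())

def find_optimal_settings(gpu: str, benchmark):
    """benchmark(gpu, combo) -> measured fps. On the farms this means actually
    running the game's test case; here it is left to the caller."""
    best_combo, best_score = None, -1
    for choices in product(*SETTINGS_MATRIX.values()):
        combo = dict(zip(SETTINGS_MATRIX.keys(), choices))
        if benchmark(gpu, combo) >= TARGET_FPS and quality_score(combo) > best_score:
            best_combo, best_score = combo, quality_score(combo)
    return best_combo
```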

Moving on, all of this feeds into NVIDIA's GFE backend service, which in turn feeds the frontend in the form of the GFE client. The GFE client has a number of features (which we'll get into in a moment), but its primary role is to find games on a user's computer, pull the optimal settings from NVIDIA, and then apply those settings as necessary. All of this is done through a relatively straightforward UI, which lists the detected games, each game's current settings, and NVIDIA's suggested settings.
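The client-side flow amounts to a detect/compare/apply loop. The sketch below uses plain dictionaries in place of real game config files and a canned response in place of NVIDIA's backend service; every name in it is hypothetical and only meant to show the shape of the process:

```python
# Hypothetical stand-ins for the games GFE would detect on a user's PC and
# for the recommendations its backend would return for a given GPU.
detected_games = {
    "Crysis 3": {"texture_quality": "Medium", "anti_aliasing": "Off"},
    "Far Cry 3": {"texture_quality": "High", "anti_aliasing": "FXAA"},
}

def fetch_recommended_settings(gpu: str, game: str) -> dict:
    """Stand-in for the query against NVIDIA's backend service."""
    canned = {
        "Crysis 3": {"texture_quality": "Ultra", "anti_aliasing": "FXAA"},
        "Far Cry 3": {"texture_quality": "High", "anti_aliasing": "FXAA"},
    }
    return canned[game]

def apply_recommendations(gpu: str) -> None:
    # Compare each game's current settings against the recommendation and
    # apply only the settings that differ.
    for game, current in detected_games.items():
        suggested = fetch_recommended_settings(gpu, game)
        changes = {k: v for k, v in suggested.items() if current.get(k) != v}
        if changes:
            print(f"{game}: applying {changes}")
            current.update(changes)
        else:
            print(f"{game}: already at recommended settings")

apply_recommendations("GeForce GTX 780")
```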

The big question of course is whether GFE’s settings are any good, and in short the answer is yes. NVIDIA’s settings are overall reasonable, and more often than not have closely matched the settings we use for benchmarking. I’ve noticed that they do have a preference for FXAA and other pseudo-AA modes over real AA modes like MSAA, but at this point that’s probably a losing battle on my part given the performance hit of MSAA.

For casual users NVIDIA is expecting this to be a one-stop solution. Casual users will let GFE go with whatever it thinks are the best settings, and as long as NVIDIA has done their profiling right users will get the best mix of quality at an appropriate framerate. For power users on the other hand the expectation isn’t necessarily that those users will stick with GFE’s recommended settings, but rather GFE will provide a solid baseline to work from. Rather than diving into a new game blindly, power users can start with GFE’s recommended settings and then turn things down if the performance isn’t quite high enough, or adjust some settings for others if they favor a different tradeoff in quality. On a personal note this exactly matches what I’ve been using GFE for since the earlier betas landed in our hands, so it seems like NVIDIA is on the mark when it comes to power users.

With all of that said, GeForce Experience isn’t going to be a stand-alone game optimization product but rather the start of a larger software suite for consumers. GeForce Experience has already absorbed the NVIDIA Update functionality that previously existed as a small optional install in NVIDIA’s drivers. It’s from here that NVIDIA is going to be building further software products for GeForce users.

The first of these expansions will be for SHIELD, NVIDIA’s handheld game console launching next month. One of SHIELD’s major features is the ability to stream PC games to the console, which in turn requires a utility running on the host PC to provide the SHIELD interface, control mapping, and of course video encoding and streaming. Rather than roll that out as a separate utility, that functionality will be built into future versions of GeForce Experience.

To that end, starting with the next driver release for the GTX 780, GeForce Experience will be bundled with NVIDIA's drivers, similar to how NVIDIA Update is today. Like NVIDIA Update it will be an optional-but-default item, so users can opt out of it, but if adoption is anything like NVIDIA Update's then the expectation is that most users will end up installing GFE.

It would be remiss of us not to point out the potential for bloat here, but we'll have to see how this plays out. In terms of file size GeForce Experience is rather tiny at 11MB (versus 169MB for the 320.14 driver package), so after installer overhead is accounted for it should add very little to the size of the GeForce driver package. Similarly it doesn't seem to have any real appetite for system resources, though that is the wildcard, since it's subject to change as NVIDIA adds more functionality to the client.

Comments

  • varad - Thursday, May 23, 2013 - link

    You do realize that a GPU like Titan has almost 5 times the number of transistors compared to Intel's biggest Core i7 CPU, right? There are 7.1 billion transistors in Titan vs 1.4 billion in Core i7 3770k. So, it means they cannot match the price of "a good CPU" unless they decide to become a non-profit organization :)
  • AssBall - Thursday, May 23, 2013 - link

    Well if all you needed was a single Titan to run your OS, computations, games, and nothing else, then no problem.
  • krutou - Sunday, May 26, 2013 - link

    Two problems with your logic

    22 nm fabrication is more expensive (price per transistor)

    CPUs are more difficult to design
  • An00bis - Friday, May 31, 2013 - link

    It's not like you can just shove your hand in a jar full of transistors, slap it on a chip, and consider it a CPU. A CPU is required to do a GPU's task (integrated GPU) AND be good at everything a GPU can't do, which is... well, lots of things actually. A GPU is much simpler, hence why a CPU's manufacturing + design cost is probably higher than that of a big ass card that has to include memory + a PCB + a GPU.
  • chizow - Thursday, May 23, 2013 - link

    Great card, but a year late. This is what GTX 600 series should've been but we all know how that went.

    I think Nvidia made some pretty big mistakes with how they handled the entire Kepler generation after Tahiti's launch price debacle. I know their financial statements and stockholders don't agree but they've managed to piss off their core consumers at every performance segment.

    Titan owners have to feel absolutely gutted at this point having paid $1000 for a part that is only ~10-15% faster than the GTX 780. End result of this generation is we are effectively paying 50-100% more for the same class of card than previous generations. While the 780 is a great card and a relatively good value compared to Titan, we're still paying $650 for what is effectively Kepler's version of the GTX 470.
  • Crisium - Thursday, May 23, 2013 - link

    People who bought a Titan knew what they were getting into. If you have regrets, you were in no position to buy a $1000 GPU to begin with and made a grievous financial error.

    $650 isn't a horrible price, but you are still paying the Nvidia Tax.
  • chizow - Thursday, May 23, 2013 - link

    I don't think so. If you polled GTX Titan owners on whether they would've paid $1000 knowing that 2-3 months later there would be a part that performed similarly at 35% less price, I think you would hear that most of them would've waited to buy not 1, but 2 for just a bit more. Or instead of buying 2 Titans, buying 3x780s.

    Also, it really has nothing to do with being in a financial position or not. It's funny, when Titan released I made the comment that anyone interested in Titan would be better served by simply investing that money in Nvidia stock, letting that money grow on Titan's fat margins, and then buying 2x780s when they released. All according to plan, for my initial investment of 1 Titan I can buy 2x780s.

    But I won't. Nvidia blew it this generation, I'll wait for Maxwell.
  • IanCutress - Thursday, May 23, 2013 - link

    Titan was a compute card with optional gaming, rather than a gaming card with optional FP64 compute. That's why the price difference exists. If you bought a Titan card for Gaming, then you would/should have been smart enough to know that a similar card without compute was around the corner.
  • chizow - Thursday, May 23, 2013 - link

    Unfortunately, that was never how *GTX* Titan was marketed, straight from the horse's mouth:
    "With the DNA of the world’s fastest supercomputer and the soul of NVIDIA® Kepler™ architecture, GeForce® GTX TITAN GPU is a revolution in PC gaming performance."

    Not to mention the fact that Titan is a horrible compute card and a poor value outside of CUDA workloads, and even there it suffers as a serious compute card due to the lack of ECC. It's an overpriced gaming card, plain and simple.

    At the time, it was still uncertain whether or not Nvidia would launch more SKUs based on GK110 ASIC, but informed consumers knew Nvidia had to do something with all the chips that didn't make the TDP cut as Tesla parts.
  • mayankleoboy1 - Thursday, May 23, 2013 - link

    Really? Apart from a few apps, Titan is poor compared to a 7970. It has bad OpenGL performance, and OpenGL is what 90% of industry renderfarms use.
    Titan is really an overpriced gaming card.
