New Drivers From NVIDIA Change The Landscape

Today, NVIDIA will release its new 185 series driver. This driver not only enables support for the GTX 275, but also affects performance across parts of NVIDIA's lineup in a good number of games. We retested our NVIDIA cards with the 185 driver and saw some very interesting results. For example, take a look at before-and-after performance with Race Driver: GRID.

As we can clearly see, on the cards we tested, performance decreased at lower resolutions and increased at 2560x1600. This was the most pronounced example, but we saw flattened resolution scaling in most of the games we tested. This could definitely affect the competitiveness of a part depending on whether we are looking at low or high resolutions.

Some trade-off was made to improve performance at ultra-high resolutions at the expense of performance at lower resolutions. The cause could be anything from something simple, like increased driver overhead (and thus more CPU limitation), to something much more complex; we haven't been told exactly what creates this situation. With higher end hardware, this decision makes sense, as resolutions lower than 2560x1600 tend to perform fine anyway, while 2560x1600 is more GPU limited and could benefit from a boost in most games.

Significantly different resolution scaling characteristics can be appealing to different users. An AMD card might look better at one resolution, while the NVIDIA card could come out on top at another. In general, we think these changes make sense, but it would be nicer if the driver automatically figured out the best approach based on the hardware and resolution in use (and thus didn't degrade performance at lower resolutions).

In addition to the performance changes, we see the addition of a new feature. In the past we've seen filtering techniques, optimizations, and even dynamic manipulation of geometry added to the driver. Some features have stuck and some just faded away. One of the most popular additions was the ability to force Full Screen Antialiasing (FSAA), enabling smoother edges in games. This feature was more important at a time when most games didn't offer an in-game way to enable AA: the driver took over and implemented AA even in games that didn't offer an option to adjust it. Today the opposite is true, and most games allow us to enable and adjust AA.

Now we have the ability to enable a feature that isn't available natively in many games, and that could either be loved or hated. You tell us which.

Introducing driver enabled Ambient Occlusion.

What is Ambient Occlusion, you ask? Well, look into a corner, or around trim, or anywhere that looks concave in general. These areas will be a bit darker than the surrounding areas (depending on the depth and other factors), and NVIDIA has included a way to simulate this effect in its 185 series driver. Here is an example of what AO can do:

Here's an example of what AO generally looks like in games:

This, as with other driver-enabled features, significantly impacts performance and might not work in all games or at all resolutions. Ambient occlusion may be something some gamers like and some do not, depending on the visual impact it has on a specific game and whether performance remains acceptable. There are already games that make use of ambient occlusion natively, and some games on which NVIDIA hasn't been able to implement AO.
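To make the idea concrete, here is a toy sketch (our own illustration, not NVIDIA's implementation) of the basic ambient occlusion term: the more of a point's hemisphere that is blocked by nearby geometry, the less ambient light it receives, which is why corners and crevices come out darker.

```python
def ambient_occlusion(hit_fractions):
    """Toy AO term: the fraction of hemisphere sample rays that are
    blocked by nearby geometry determines how much ambient light is lost.
    hit_fractions holds 1.0 for a blocked ray and 0.0 for one that escapes."""
    if not hit_fractions:
        return 1.0  # no samples: assume fully open
    occluded = sum(hit_fractions) / len(hit_fractions)
    return 1.0 - occluded  # 1.0 = fully open, lower = darker

# An open, flat surface: no samples blocked, so full ambient light.
open_surface = ambient_occlusion([0.0] * 16)
# A concave corner: half the hemisphere is blocked, so half the ambient light.
corner = ambient_occlusion([1.0] * 8 + [0.0] * 8)
```

Real implementations estimate this blocking in screen space from the depth buffer rather than by casting rays, which is what makes a driver-level version feasible.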

There are different methods of rendering an ambient occlusion effect, and NVIDIA implements a technique called Horizon Based Ambient Occlusion (HBAO for short). The advantage is that this method is likely highly optimized to run well on NVIDIA hardware; on the down side, developers limit the ultimate quality and technique used for AO if they leave it to NVIDIA to handle. On top of that, if a developer wants to guarantee that the feature works for everyone, they would need to implement it themselves, as AMD doesn't offer a parallel solution in its drivers (in spite of the fact that its hardware is easily capable of running AO shaders).
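In broad strokes, HBAO works in screen space: for each pixel it marches along a few directions in the depth buffer, finds the steepest "horizon" angle formed by nearby samples, and converts that angle into an occlusion term. The following 1D toy version (our simplification, not NVIDIA's shader code) shows the idea along a single direction:

```python
import math

def horizon_occlusion(depths, center_index, pixel_size=1.0):
    """Toy 1D horizon-based occlusion.
    depths: screen-space height values marching away from the pixel.
    Returns occlusion in [0, 1]: 0.0 = open horizon, higher = more occluded."""
    h0 = depths[center_index]
    max_angle = 0.0
    # March away from the pixel, tracking the steepest angle to any sample.
    for step, h in enumerate(depths[center_index + 1:], start=1):
        rise = h - h0
        if rise > 0.0:
            angle = math.atan2(rise, step * pixel_size)
            max_angle = max(max_angle, angle)
    # Map the horizon angle to an occlusion fraction; taking the sine of
    # the angle is one common choice in horizon-based formulations.
    return math.sin(max_angle)

flat = horizon_occlusion([0.0, 0.0, 0.0, 0.0], 0)  # open ground: no occlusion
wall = horizon_occlusion([0.0, 1.0, 1.0, 1.0], 0)  # adjacent wall: occluded
```

The real technique repeats this march in several rotated directions per pixel and attenuates distant samples, which is also why it carries a noticeable performance cost.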

We haven't done extensive testing with this feature yet, either for quality or for performance. Only time will tell whether this addition ends up being gimmicky or really hits home with gamers. And if more developers create games that natively support the feature, we wouldn't even need the option. But it is always nice to have something new and unique to play around with, and we are happy to see NVIDIA pushing effects in games forward by all means possible, even to the point of including effects like this in its driver.

In our opinion, lighting effects like this belong in engine and game code rather than in the driver, but until that happens it's always great to have an alternative. We wouldn't think it a bad idea if AMD picked up on this and did it too, but whether it is more worthwhile to do this or to spend that energy encouraging developers to adopt this and comparable techniques for more complex lighting is totally up to AMD. And we wouldn't fault them either way.

294 Comments

  • papapapapapapapababy - Thursday, April 2, 2009 - link

    Is the quality of the drivers. ATI. Call me when: A) you fix your broken drivers. B) Decide to finally ditch that bloated Microsoft Visual C++ just so i can have the enormous privilege of using your- also terrible and also bloated- CCC panel. c) Stop pouting my pc with your useless extra services. Until then 'll carry on with NVIDIA. Thanks. > Happy- nvidia- user (and frustrated ex-ATI costumer)
  • josh6079 - Thursday, April 2, 2009 - link

    The quality of drivers can be argued either way and the negative connotations associated with "Drivers" and "ATi" are all but ancient concerns in the single GPU arena.
  • papapapapapapapababy - Thursday, April 2, 2009 - link

    BS. I have a pretty beefy pc, that doesn't mean im going to stop demanding for efficiency when it comes to memory usage and to reduce the shear amount of stupid services required to run a simple application. This are all FACTS about Ati. But hey, you are free to use vista, buy ATI and end up with a system that is inferior and slower than mine.( performance and feature wise)

    btw, to all the people claiming that cuda and physics are gimmicks... Give me a fn break! U Hypocrites. This cards ARE HUGE GIMMICKS! BEHOLD he MEGA POWAAR! For what? Crysis? Thats just ONE GAME. ONE. UNO. 1. Then what?... Console ports. At the end of the day 99.9% of games today are console ports. The fact is, you don't need this monstrosities in order to run that console crap. Trust me, you may get a boner comparing 600 fps vs 599, but the rest of the - sane- people here, dsnt give a rat ass, expectantly when the original - console game- barely runs at 30fps to begin with.

  • SiliconDoc - Monday, April 6, 2009 - link

    The red roosters cannot face reality my friend. They are insane as you point out.
    cuda, physx, badaboom, the vreveal, the mirrors edge that addicted anand, none of it matters to red roosters - the excessive heat at idle from ati - also thrown out with the bathwater, endless driver issues, forget it - no forced multi gpu - nmevermind, no gamer profiles built into the driver for 100 popular games that nvidia has - forget it - better folding performance, forget it -NOT EVEN CURING CANCER MATTERS WHEN A RED ROOSTER FANBOI IS YAKKING ALONG THAT THEY HAVE NO BIAS.
    Buddy, be angry, as you definitely deserve to be - where we got so many full of it liars is beyond me, but I suspect the 60's had something to do with it. Perhaps it's their tiny nvidia assaulted wallets - and that chip on their shoulder is just never going away.
    I see someone mentioned Nvidia "tortured" the reviewers. LOL
    hahahaahahahahahahaaa
    There is no comparison... here's one they just hate:
    The top ati core, without ddr5, and neutered for the lower tier, is the 4830.
    The top nvd core, without ddr5, and neutered for the lower tier, is the 260/192.
    Compare the 260/192 to the 4830 - and you'll see the absolute STOMPING it takes.
    In fact go up to the 4850, no ddr5 - and it's STILL a kicking to proud of.
    Now what would happen if NVidia put ddr5 on it's HATED G92 "rebrand" core ? LOL We both know - the 9800gtx+ and it's flavors actually competes equivalently with the 4850 - if Nvidia put ddr5 on it, it WOULD BE A 4870 .
    Now, that G80/G92/G92b "all the same" according to the raging red roosters who SCREAM rebranded - is SEVERAL YEARS OLD nvidia technology... that - well - is the same quality as the current top core ati has produced.
    So ATI is 2 years behind on core - luckily they had the one company that was making the dddr5 - to turn that 4850 into a 4870 - same core mind you!
    So, the red roosters kept screaming for a GT200 "in the lower mid range" --- they kept whining nvidia has to do it - as if the 8800/9800 series wasn't there.
    The real reason of course, would be - it could prove to them, deep down inside, that the GT200 "brute force" was really as bad or worse than the 770 core - bad enough that they could make something with it as lowly as the 4830...
    Ahh, but it just hasn't happened - that's what the 2 year old plus rebrand of nvidia is for - competing with the ati core that doesn't have the ddr5 on it.
    Well, this reality has been well covered up by the raging red rooster fanboys for quite some time. They are so enraged, and so deranged, and so filled with endless lies and fudging, that they just simply missed it - or drove it deep within their subconscoius, hoping they would never fully, conscoiusly realize it.
    Well, that's what I'm here for ! :)
    To spread the good word, the word of TRUTH.
  • josh6079 - Friday, April 3, 2009 - link

    I just came from using Cat 9's to 182+'s when I upgraded to nVidia.

    The "efficiency when it comes to memory usage" is a non-issue -- especially on a "beefy pc."

    The windows task manager is not a benchmark leading to conclusive comparisons regarding quality. My Nvidia GPU can (and has) occupied more memory, especially when I utilize nHancer so as to tap the super-sampling capabilities.

    Also, it's something to note that nVidia's latest driver download is 77.0 MB in size, yet ATi's latest is only 38.2 MB.
  • papapapapapapapababy - Saturday, April 4, 2009 - link

    1) Nhancer is just an optional utility, optional. IF want to check the gpu temps i just use gpuz, or everest, if i want to overclock i just use rivaturner, for the rest i have the nvidia control panel.

    The ccc, not only is a bloated crap, it also requires Ms NET framework, and spawns like 45 extra services running non stop ALL THE TIME, clogging my pc, and the thing dsnt even work! GREAT WORK ATI! CCC is stupidly slow and broken. Se, i dont need a massive mega all in one solution that doesn't work and runs like ass.


    2) YOUR Nvidia GPU. YOUR. Thats the key word, here. Your fault. Just like that windows task manager of yours, it seems to me you just didnt know how to use that nvidia gpu . And you need that knowledge in order to form conclusive comparisons regarding efficiency.

    3) i made a gif, just for you. here. try no to hurt yourself reading it.

http://i39.tinypic.com/5ygu91.jpg


    3) upgrade your adsl? btw the nvidia driver includes the extra functionality, that ati dsnt even have. ( and hint, it doesn't pollute your pc!)



  • tamalero - Sunday, April 5, 2009 - link

    sorry to bust your bubble, but your screenshots is no proof, its clear you removed a LOT OF PROCESSES just to take the screenshot, how about if you take the FULL desktop screenshot that shows the nvidia panel loaded?
    because it doesnt seem to be in the process list.

    also you're liying, I got an ATI 3870. and I only got 3 processes of ATI, one of them being the "AI" tool for game optimizations(using the latest drivers).
    and I agree with Anandtech for first time ever, PhysX is just not ready for the "big thing"
    most of the things they bloat are just "tech demos" or very weak stuff, Mirrors Edge's PhysX in other hand does show indeed add a lot of graphical feel.

    and funny you mention framework, because a lot of new games and programs NOW NEED the framework fundation, or at least the C++ redistribuitable groups.


    also lately I've been reading a lot of fuzz related to the 275's optomizations, wich in many games forces games to use less AA levels than the chosen ones. and thus giving the "edge" to the 275 vs the 4890. (MSAA vs TRSS)
    I suppose again NVidia as been playing the dirty way.
    and its gets annoying how Nvidia as been doing that for quite a bit to keep the dumb thing of "opening a can of whoop ass"
  • papapapapapapapababy - Monday, April 6, 2009 - link

    "you removed a LOT OF PROCESSES", dumbass... if its not a necessary service, its turned off REGARDLESS OF THE videocard . maybe you should try to do the same? (lol just 3 ati services, run the ccc and see) btw, if i CHOOSE to use the nvidia control panel, THEN a new nvidia service starts, THEN as soon as I CLOSE THE control panel ( the process disappears from the task manager. THAT 3 ATI SERVICES OF YOURS ARE RUNNING ALL THE FRINGIN TIME, DUMMY, Remaining resident in YOUR memory REGARDLESS IF YOU USE OR NOT THE CCC. AND THEY GET BIGGER, AND BIGGER, AND BIGGER. ALSO YOU HAVE TO USE THE NET CRAP. (EXTRA SERVICES!) AND FINALLY, THE CCC RUNS LIKE ASS. so there, i WIN. YOU LOOSE. END OF STORY.
  • tamalero - Thursday, April 9, 2009 - link

    hang on, since when you need the CCC panel to be ON ( Ie, loaded and not in the tray ) to play games?
    are you a bit dumb?

    second, why you didnt filter out the services then?
    your screenshot is bull
    its almost like you ran WinXP in safe mode just to take the screenshot and claim your "memory superiority".

    like I said, show us a full screen that shows the nvidia panel LOADED .


    your argument is stupid .
    4 Mb of Ram must be a LOT for you? (thats what my ATI driver uses currently on vista X64.. )
    btw, theres also an option in ATI side to remove the panel from the tray.
    the tray serves a similar function as ATI TOOL ( Ie, fast resolution , color dept and frecuency changes )

    play apples with apples if you want to make a smart conversation.
    "runs like ass", makes me wonder how old are you, 14 years old?
    and my CC runs very fine, thank you!, not a single error.


    also, I got all frameworks installed and even when programs loaded, I dont see any "framework" services running, nor application, so please, get your head out of your ass.
    you're just like this pseudo SiliconDr who spreads only FUD and insults.
  • SiliconDoc - Friday, April 24, 2009 - link

    Besides all your errors and the corrections issued, it comes down to you claiming " Don't load the software that came with the ATI card because it's a fat bloated pig thqt needs to be gotten rid of".
    Yes, and most people who want the performance have to do that.
    Sad, isn't it ?
    You do know you got spanked badly, and used pathetic 3rd grader whines like "~ your screencap is fake" after he had to correct you on it all....
    Just keep it shut until you have a valid point - stop removing all doubt.
