New Drivers From NVIDIA Change The Landscape

Today, NVIDIA will release its new 185 series driver. This driver not only enables support for the GTX 275, but also affects performance across much of NVIDIA's lineup in a good number of games. We retested our NVIDIA cards with the 185 driver and saw some very interesting results. For example, take a look at before-and-after performance in Race Driver: GRID.

As we can clearly see, on the cards we tested, performance decreased at lower resolutions and increased at 2560x1600. This was the most pronounced example, but we saw flattened resolution scaling in most of the games we tested. This could certainly affect the competitiveness of a part depending on whether we are looking at low or high resolutions.

A trade-off was made to improve performance at ultra high resolutions at the expense of performance at lower resolutions. The cause could be anything from something simple, like added driver overhead (and thus more CPU limitation), to something much more complex; we haven't been told exactly what creates this situation. With higher end hardware, this decision makes sense: resolutions lower than 2560x1600 tend to perform fine already, while 2560x1600 is more GPU limited and could benefit from a boost in most games.
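One way to see why added driver overhead would hurt only the lower resolutions is a toy frame-time model: each frame is limited by whichever of the CPU or GPU takes longer, and GPU time grows with resolution while CPU time stays roughly flat. All of the numbers below are invented for illustration, not measurements, and whether overhead is actually the cause is speculation on our part:

```python
def fps(cpu_ms, gpu_ms):
    """Toy model: each frame is limited by the slower of CPU and GPU."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical GPU frame times grow with resolution; CPU work is
# roughly constant per frame regardless of resolution.
gpu_ms = {"1280x1024": 5.0, "1920x1200": 9.0, "2560x1600": 16.0}

for res, g in gpu_ms.items():
    old = fps(cpu_ms=8.0, gpu_ms=g)          # old driver
    new = fps(cpu_ms=10.0, gpu_ms=g * 0.85)  # new driver: more CPU overhead,
                                             # but better GPU throughput
    print(f"{res}: {old:.1f} fps -> {new:.1f} fps")
```

Under these made-up numbers, the low resolutions were already CPU-bound, so the extra overhead shows up as a regression there, while the GPU-bound 2560x1600 case only sees the improvement, which is the shape of the scaling change we measured.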

Significantly different resolution scaling characteristics can appeal to different users: an AMD card might look better at one resolution, while the NVIDIA card comes out on top at another. In general, we think these changes make sense, but it would be nicer if the driver automatically picked the best approach based on the hardware and resolution in use (and thus didn't degrade performance at lower resolutions).

In addition to the performance changes, we see the addition of a new feature. In the past we've seen filtering techniques, optimizations, and even dynamic manipulation of geometry added to the driver. Some features have stuck and some just faded away. One of the most popular additions was the ability to force Full Screen Antialiasing (FSAA), enabling smoother edges in games. This feature was most important at a time when most games didn't offer an in-game way to enable AA: the driver took over and implemented AA even in games that offered no option to adjust it. Today the opposite is true, and most games allow us to enable and adjust AA.

Now we have the ability to enable a feature that isn't available natively in many games and that could either be loved or hated. You tell us which.

Introducing driver enabled Ambient Occlusion.

What is Ambient Occlusion, you ask? Look into a corner, around trim, or anywhere that looks generally concave. These areas will be a bit darker than the surrounding areas (depending on the depth and other factors), and NVIDIA has included a way to simulate this effect in its 185 series driver. Here is an example of what AO can do:

Here's an example of what AO generally looks like in games:

This, as with other driver-enabled features, significantly impacts performance and may not work in all games or at all resolutions. Whether gamers like Ambient Occlusion will depend on the visual impact it has on a specific game and on whether performance remains acceptable. There are already games that make use of ambient occlusion natively, and some games on which NVIDIA hasn't been able to enable AO.

There are different methods of rendering an ambient occlusion effect, and NVIDIA implements a technique called Horizon Based Ambient Occlusion (HBAO for short). The advantage is that this method is likely highly optimized to run well on NVIDIA hardware; the downside is that developers limit the ultimate quality and technique used for AO if they leave it to NVIDIA to handle. On top of that, if a developer wants to guarantee that the feature works for everyone, they would need to implement it themselves, as AMD doesn't offer a parallel solution in its drivers (despite the fact that its hardware is easily capable of running AO shaders).
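For the curious, the core idea behind horizon-based AO can be sketched on a toy 1D height field: for each point, find the steepest "horizon" angle to nearby geometry in each direction; the higher the horizon, the more of the ambient hemisphere is blocked. The function name, sample radius, and simple averaging below are our own illustration of the concept, not NVIDIA's actual implementation (which runs per-pixel on the depth buffer with distance falloff and many sampling directions):

```python
import math

def hbao_1d(heights, i, radius=4):
    """Toy horizon-based ambient occlusion on a 1D height field.

    For the sample at index i, find the highest 'horizon' elevation
    angle among nearby samples in each direction; steeper horizons
    block more of the ambient hemisphere. Returns 1.0 for fully lit,
    lower values for occluded (darker) points.
    """
    occlusion = 0.0
    for direction in (-1, 1):
        horizon = 0.0  # largest elevation angle seen in this direction
        for step in range(1, radius + 1):
            j = i + direction * step
            if 0 <= j < len(heights):
                rise = heights[j] - heights[i]
                if rise > 0:
                    horizon = max(horizon, math.atan2(rise, step))
        # fraction of the quarter-hemisphere blocked in this direction
        occlusion += horizon / (math.pi / 2)
    return 1.0 - occlusion / 2  # average both directions

# A concave 'corner' at the base of a step receives less ambient
# light than flat ground, which is exactly the darkening AO adds:
print(hbao_1d([0, 0, 0, 3, 3, 3], 2))  # corner next to a wall: < 1.0
print(hbao_1d([0, 0, 0, 0, 0, 0], 2))  # flat ground, fully lit: 1.0
```

The real thing works in screen space on the depth buffer, which is why it can be bolted on by the driver without the game's cooperation, and also why it carries the performance cost noted above.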

We haven't done extensive testing with this feature yet, either for image quality or for performance. Only time will tell whether this addition ends up being gimmicky or really hits home with gamers. And if more developers create games that natively support the feature, we wouldn't even need the option. But it is always nice to have something new and unique to play around with, and we are happy to see NVIDIA pushing effects in games forward by all means possible, even to the point of including effects like this in their driver.

In our opinion, lighting effects like this belong in engine and game code rather than in the driver, but until that happens it's always great to have an alternative. We wouldn't think it a bad idea if AMD picked up on this and did the same, but whether it is more worthwhile to do this or to spend that energy encouraging developers to adopt this and comparable techniques for more complex lighting is entirely up to AMD. We wouldn't fault them either way.


  • SiliconDoc - Tuesday, April 7, 2009 - link

    Another red rooster who cannot argue with the facts and the truth, and doesn't want them known.
    Perhaps you'd notice, I didn't comment right away when the STORIED review came out, you FOOL.
    I came days later, and made my comments after you had your bs fest of lies, so I don't expect a lot of responders, you DUMMY.
    But you're here, and your response is calling for DEATH.
    Now, if anyone needs to be banned, YOU DO.
    Furthermore, I really don't care if you're here, and have enjoyed some of your posts, but the fact remains, where I have absolutely FACTUALLY refuted your BS in some of your posts, you have no response - other than your own personal rage.
    I'll be glad to see how you can defend yourself, but you obviously cannot.
    Go ahead, there's 22 pages, and I've pointed out your lies several times. Have at it. Good luck, just calling for DEATH, and spewing "ban him!" while carrying your torch of lies is just what I expect from someone who doesn't care what bs they spew.
    You already claimed you can't understand - LOL - of course you can't, you'd have to straighten out yourself and your lies then.
    Good luck doing that.
  • SiliconDoc - Monday, April 6, 2009 - link

    LOL - the folding was crap forever on ati, and now it's slower.
    We know the release date for both cards, and the nvidia is already listed on the egg dude.
    When you're a raging red rooster, nothing matters to you but lying for the 2 billion dollar loser - ati.
  • sidk47 - Friday, April 3, 2009 - link

    You cannot argue with facts and the fact of the matter is that you can't help the world find a cure for cancer or Alzheimer's by buying an ATI!
    So those of you with an Internet connection, should buy an NVidia and fold@home all the time to help make the world a better place!
    Take that ATI and your associated fanboys!
  • x86 64 - Sunday, April 5, 2009 - link

    Folding at home is a total waste and is just an excuse to be smug and think you're special, so there to both of you.

    "Oh I'm going to save the world by buying overpriced hardware and letting some university use it for studying the human genome. I'm such a humanitarian."

    Please, you can justify your over indulgence any way you want but it still doesn't cover up the fact that you're trying to justify sitting on your asses instead of doing some real community work to help change the world.

    Folding@home = Too fat and too lazy to really make an effort.
  • SiliconDoc - Monday, April 6, 2009 - link

    Uhh, dude, they're doing it at college, on like triple TESLA machines with the "supercomputer" motherboards - so you know, go get an education and start whining about unbelievable game framerates - that's what's really going on -
    Professor cuda machine checker " What happened ? "
    Gamer students " Oh, uhh, well it crashed again it was a Crysis, I mean uh, no crisis, last night and it took us about 5 hours to reset the awesome TESLA cards. We'll come in tonight to keep an eye on it, and clean up the pizza boxes and lock up again professor."
    " Very well."
    WHO LOVES THE EDUCATION OF AMERICA? !!!
    hahahahaha
  • LeonRa - Saturday, April 4, 2009 - link

    Well, since you cannot argue with facts, it's a fact you are a stupid fanboy who doesn't know anything! Check your facts before you post something like that. It is a fact that you can do f@h with an ATI card, as I have been doing it for some time now. So STFU and go spill your hatred somewhere else!
  • SiliconDoc - Tuesday, April 7, 2009 - link

    You're not being honest there. A while back ati either couldn't do it at all (no port) - or it was so pathetic - they had to make a new port - I know they did the latter, and as far as having a long stretch where it wasn't available, or just not used much since it was so pathetically slow in comparison, the fella has the right idea.
    Furthermore, unless something has recently changed significantly, the ati port is still WAY slower than the Nvidia for folding.
    So anyway, nice try, but telling the truth might actually be something the red rooster crew should start practicing .... or perhaps not, considering lying a whole heckuva lot might make those 2 billion dollar ati losses into "sales" that make "overall a profit" a reality...
    On the other hand, if people continuously notice the lying by the red fans, they might gravitate to the competition, for obvious reasons.
    So, honesty, or more bs ? I think I know what you'll choose.
  • marraco - Friday, April 3, 2009 - link

    I hope to see benchmarks with ATI in charge of graphics, and a Geforce in charge of PhysX.

    ... kind of SLI/crossfire between ATIs and Geforces :)

    A value-added of the Geforces is that, once you buy a new card, the old one can offload PhysX from the new card. Nice. I hate wasting old hardware.

    On the other side, most of the games on the PhysX nvidia list don't really work with GPGPU PhysX. Only with the old AGEIA cards.

    Sadly, Crysis and Far Cry don't use PhysX. Only Havok. And AMD still doesn't support it in hardware.
  • spinportal - Friday, April 3, 2009 - link

    No mention of the death of the HD 4850X2 as the HD4890 trashes the power consumption, price, availability, speed and OC-ability. No mention of advantage of DX10.1 and the games available. Hey, even bad news is good news sometimes by spotlighting. What is really missing is the bang for buck quality (bucks spent for performance increase), and talk about price depression for the HD 4870 1GB model by 10$ to 15$ with $50 step increments.
    4850 (125)[20.9] 4870 (185)[27.9] 4890 (235)[31.7]
    4870X2 (400)[35.0]
    Nvidia is cramping its own style:
    250 (150)[21.8] 260-216-55 (180)[27] 275 (250?)[31.3]
    280 (290)[30.9] 285 (340)[32.8]
    The GTX280 is dead now, overpriced for those trying to sneak into SLI. The GTX260 is overlapped with Core216 55nm you'd want to get, but Joe Consumer might mistakenly get the other 2 prior versions to clean out old inventory. The GTX285's price is not justified but more power to nVidia if they get the consumer's buck.
    Gladly, by the low temps the dual slot blowback is voiding hot air properly so the vendors are finally manufacturing cards with common sense.
    Too bad we have gone the way with power hungry beastly cards needing two 6-pins.
    Also, too bad the effects of AF and 0x00, 2xAA, 4xAA and 8xMSAA modes are not investigated. It would be interesting to see how saturated the units get as AF and AA gets bumped and what are the best modes for nVidia and AMD.
    Oh, nice blurb for nVidia's shadow enhancement, but ATi/AMD's tessellation enhancement is just as much a hit or miss feature. Will AMD have a tech edge when DX11 tessellation cometh?
  • SiliconDoc - Monday, April 6, 2009 - link

    Hmm, that said, Derek might be crying, since he couldn't stop crowing about that 4850x2 last review - oh boy, you know - I guess he had the heads up and ati told him what card he needed to help push...
    You know how things are.
    Anyway, good observation.
