New Drivers From NVIDIA Change The Landscape

Today, NVIDIA will release its new 185 series driver. This driver not only enables support for the GTX 275, but also affects performance across NVIDIA's lineup in a good number of games. We retested our NVIDIA cards with the 185 driver and saw some very interesting results. For example, take a look at before and after performance with Race Driver: GRID.

As we can clearly see, on the cards we tested, performance decreased at lower resolutions and increased at 2560x1600. This was the most pronounced example, but we saw flattened resolution scaling in most of the games we tested. This could definitely affect the competitiveness of the part depending on whether we are looking at low or high resolutions.

A trade-off was made to improve performance at ultra high resolutions at the expense of performance at lower resolutions. The cause could be anything from something simple, like added driver overhead (and thus more CPU limitation), to something much more complex; we haven't been told exactly what creates this situation. With higher end hardware, this decision makes sense, as resolutions lower than 2560x1600 tend to perform fine already, while 2560x1600 is more GPU limited and could benefit from a boost in most games.

Significantly different resolution scaling characteristics can be appealing to different users. An AMD card might look better at one resolution, while the NVIDIA card could come out on top with another. In general, we think these changes make sense, but it might be nicer if the driver automatically figured out what approach was best based on the hardware and resolution running (and thus didn't degrade performance at lower resolutions).

In addition to the performance changes, we see the addition of a new feature. In the past we've seen the addition of filtering techniques, optimizations, and even dynamic manipulation of geometry to the driver. Some features have stuck and some just faded away. One of the most popular additions to the driver was the ability to force Full Screen Antialiasing (FSAA), enabling smoother edges in games. This feature was more important at a time when most games didn't offer an in-game way to enable AA: the driver took over and implemented AA even in games that didn't offer an option to adjust it. Today the opposite is true, and most games allow us to enable and adjust AA.

Now we have the ability to enable a feature, which isn't available natively in many games, that could either be loved or hated. You tell us which.

Introducing driver enabled Ambient Occlusion.

What is Ambient Occlusion, you ask? Look into a corner, around trim, or anywhere that looks concave in general. These areas will be a bit darker than the surrounding areas (depending on the depth and other factors), and NVIDIA has included a way to simulate this effect in its 185 series driver. Here is an example of what AO can do:

Here's an example of what AO generally looks like in games:
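For the curious, the darkening AO produces can be approximated in screen space: sample the depth buffer around each pixel and darken the pixel in proportion to how many nearby samples sit in front of it. Here is a minimal, purely illustrative sketch of that idea; the function, buffer layout, and constants are our own toy construction, not NVIDIA's implementation:

```python
def ssao_factor(depth, x, y, radius=2, bias=0.05):
    """Toy screen-space ambient occlusion for one pixel.

    depth is a 2D list of view-space depths (larger = farther away).
    We count nearby samples that sit noticeably in front of the centre
    pixel; the more such occluders, the darker the result
    (1.0 = fully lit, values below 1.0 = darkened).
    """
    h, w = len(depth), len(depth[0])
    centre = depth[y][x]
    occluded = samples = 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dx == 0 and dy == 0:
                continue
            nx, ny = x + dx, y + dy
            if 0 <= nx < w and 0 <= ny < h:
                samples += 1
                # a neighbour noticeably closer to the camera occludes us
                if centre - depth[ny][nx] > bias:
                    occluded += 1
    return 1.0 - occluded / samples if samples else 1.0

# a flat wall at depth 1.0 meeting closer geometry (depth 0.5) on the right
depth = [[1.0] * 4 + [0.5] * 4 for _ in range(8)]
print(ssao_factor(depth, 1, 4))  # open area: fully lit
print(ssao_factor(depth, 3, 4))  # next to the corner: darkened
```

Real implementations do this per-pixel on the GPU with randomized sample kernels and a blur pass, but the core trade-off is visible even here: more samples mean better quality and more work, which is why the feature carries a performance cost.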

This, as with other driver enabled features, significantly impacts performance and might not be able to run on all games or at all resolutions. Ambient Occlusion may be something some gamers like and some do not depending on the visual impact it has on a specific game or if performance remains acceptable. There are already games that make use of ambient occlusion, and some games that NVIDIA hasn't been able to implement AO on.

There are different methods to enable the rendering of an ambient occlusion effect, and NVIDIA implements a technique called Horizon Based Ambient Occlusion (HBAO for short). The advantage is that this method is likely very highly optimized to run well on NVIDIA hardware, but on the downside, developers limit the ultimate quality and technique used for AO if they leave it to NVIDIA to handle. On top of that, if a developer wants to guarantee that the feature works for everyone, they would need to implement it themselves, as AMD doesn't offer a parallel solution in their drivers (in spite of the fact that their hardware is easily capable of running AO shaders).
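To give a feel for the "horizon" in HBAO's name: the technique treats the depth buffer as a heightfield and, for each pixel, marches along screen-space directions to find the highest elevation angle of nearby geometry; the higher that horizon, the more incoming light is blocked. A one-dimensional toy sketch of the idea follows; it is our own simplification for illustration, not NVIDIA's shader:

```python
import math

def horizon_occlusion(height, i, max_steps=4):
    """1-D sketch of the horizon-based AO idea.

    height is a heightfield (think of one row of a depth buffer turned
    into heights). For pixel i we march to the right and track the
    highest elevation angle seen -- the 'horizon'. The higher the
    horizon, the more of the sky is blocked, so we return
    sin(horizon angle): 0 for open ground, approaching 1 near a wall.
    """
    horizon = 0.0  # flat ground ahead = no occlusion
    for step in range(1, max_steps + 1):
        j = i + step
        if j >= len(height):
            break
        rise = height[j] - height[i]
        if rise > 0:
            horizon = max(horizon, math.atan2(rise, step))
    return math.sin(horizon)

terrain = [0, 0, 0, 3, 3]             # a wall three units high
print(horizon_occlusion(terrain, 0))  # far from the wall: mild occlusion
print(horizon_occlusion(terrain, 2))  # at the base: heavy occlusion
```

The real technique does this in multiple screen-space directions per pixel and attenuates distant samples, which is exactly the kind of per-pixel work that makes the driver feature expensive at high resolutions.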

We haven't done extensive testing with this feature yet, either looking for quality or performance. Only time will tell if this addition ends up being gimmicky or really hits home with gamers. And if more developers create games that natively support the feature we wouldn't even need the option. But it is always nice to have something new and unique to play around with, and we are happy to see NVIDIA pushing effects in games forward by all means possible even to the point of including effects like this in their driver.

In our opinion, lighting effects like this belong in engine and game code rather than the driver, but until that happens it's always great to have an alternative. We wouldn't think it a bad idea if AMD picked up on this and did it too, but whether it is more worth it to do this or to spend that energy encouraging developers to adopt this and comparable techniques for more complex lighting is totally up to AMD. And we wouldn't fault them either way.


294 Comments


  • SiliconDoc - Monday, April 6, 2009 - link

    LOL - another hidden red rooster bias uncovered...
    Umm... look, when there's a new ati card, there's no talking about crunching down on former ati cards - OK ? That just is NOT allowed.
    " No mention of the death of the HD 4850X2 as the HD4890 trashes the power consumption, price, availability, speed and OC-ability "
    Dude, not allowed !
    PS- Don't mention how this card is going to smash the "4870" "profit" "flagship" - gee now just don't talk about it - don't mention it - look, there's no rooster crying in fps gaming, ok ?
  • Torquer350 - Friday, April 3, 2009 - link

    Props to ATi for delivering a very compelling product. I admit I've always been an Nvidia fan, and I'll generally forgive them a single generational performance loss to ATi, but I've recommended ATi products recently to friends due to their resurgent desirability.

    That being said, am I the only one who detects a subtle but distinct underlying disdain for Nvidia? So they tried to market the hell out of you - so what? They are trying to sell cards here. Why the surprise that sales and marketing people are trying to do exactly what they're paid to do? Congrats for being smart enough to see it for what it is, but jeers for making an issue of it as if it's some kind of new tactic. Has AMD/ATi never done the same?

    CUDA and PhysX are compelling, but I agree not a good reason to overcome a significant gap between Nvidia and ATi at a comparable price point. You clearly agree, but it seems like what little praise you offer is begrudging in the extreme.

    Nvidia has definitely acted in bad form in a number of ways throughout this very lengthy generation of hardware. However, you guys are journalists and in my opinion should make a more concerted effort to leave the vitriol and sensationalism at the door, regardless of who it is that is being reviewed. That kind of emotional reaction, personal opinion, irritation, etc is better served for your blog posts than a review article.

    Love the site, keep up the good work. Nobody's perfect.
  • SiliconDoc - Monday, April 6, 2009 - link

    Yeah thanks for noticing, too. It's been going on a long time. Notice how now, suddenly, when ati doesn't have 2560 sewn up - it doesn't matter anymore ... LOL
    Of course the "brilliantly unbiased" reviewers will claim they did a poll on monitor resolution usage, and therefore suddenly came to their conclusion about $2,000.00 monitor users, when they tiddled and taddled for years about 10 bucks between framerates and nvidia ati - and chose ati for the 10 bucks difference.
    Yep, 10 bucks matters, but $1,700.00 difference for a monitor doesn't matter until they take a poll. Now they didn't say it, but they will - wait it's coming...
    Just like I kept pointing out when they raved about ati taking the 30" resolution and not much if anything else, that declaring it the winner wasn't right. Now of course, when ati isn't winning the 30 rez - yes, well, they finally caught on. No bias here ! Nothing to notice, pure professionalism, and hatred of cuda and physx for its lack of ability to run on ati cards is fully justified, and should offer NO advantage to nvidia when making a purchase decision ! LOL
    OMG ! they're like GONERZ man.
  • Dried - Friday, April 3, 2009 - link

    Best review so far. And nice cards BTW, they are both worth it, but I like the 4890 better.
    Funny thing is that GTX 275 > GTX 280.
    But my guess is that GTX 280 benefits more from overclocking.
  • Arbie - Friday, April 3, 2009 - link

    Because of my PC's location I am concerned with idle power, and purchase based on that if other specs and price are even comparable. Peak power doesn't matter as long as it's within the capability of my 800W PSU.

    I bought an ATI HD4850 last year because it idled significantly lower than the 4870, and it would run everything in sight. A great card. The Nvidia GTX 260 and 280 had even better performance vs idle power ratios but were way too expensive at the time.

    So I think Nvidia takes the laurels now with the GTX 275. 30W less (!) than the HD 4890 at idle, with essentially the same performance. If I were shopping now it would be a VERY easy choice.

    I really hope ATI can get their idle power down too. They need to pay more attention to throttling back or downpowering circuits that aren't needed in 2D modes.
  • helldrell666 - Friday, April 3, 2009 - link

    Use the radeon bios editor to edit the 2d profile and then downclock your gpu frequencies.
  • OCedHrt - Friday, April 3, 2009 - link

    The power consumption on the 4890 really interests me. While it uses more than 275 at idle, it uses less under load. Also, it is a significant drop from the 4870 which is a slower card.
  • bobvodka - Friday, April 3, 2009 - link

    So, on the charge of drivers: I've gone from recently having a GT8800GTX 512Meg to a HD4870X2 2gig and if anything I've seen stability improvements between the two. Or to put it another way, NV drivers were bluescreening my Vista install when I was doing nothing more than using my TV card, and it was crashing in a DirectDraw DLL. Nice.

    Not to say AMD hasn't had issues; trying to use hardware acceleration with any bluray play back resulted in a bluescreen due to the gpu going into an infinite loop. Nice. Fortunately, unlike the DDraw error above, I could at least turn off hardware acceleration (and honestly, with an i7 it's not like I needed it).

    So, stability wise it's a wash.
    As for the memory usage complaints about CCC:
    Unless it is running, it is NOT taking up physical memory. Like many things in the windows world it might load something into the background, but this is quickly paged out and doesn't live in RAM. Even if it does live in RAM for a short period while inactive, it will be paged out as soon as memory pressure requires it. The simple fact is unused RAM is wasted RAM; this is why I'm glad Vista uses 10gig of my 12 for cache when it isn't needed for anything else, it speeds up the system.

    Cuda.. well, the idea is nice and I like the idea, but as mentioned in the article, unless you have cross vendor support it isn't as useful as it could be. OpenCL and, for games, DX11's compute shaders are going to make life interesting for both Cuda and AMD's option. I will say this much; I suspect you'll get better performance from NV, AMD and indeed Larrabee when it appears by going 'to the metal' with them, but as with many things in the software world you have to trade something for speed.

    Now, PhysX.. well, this one is even more fun (and I guess it affects Cuda as well to a degree). Right now, with Vista, you can't run more than one vendor's gfx card in your system at once due to how WDDM1.0 works; so it's AMD or NV and that's your choice. With Win7 however the rules change slightly and you'll be able to run, with WDDM1.1 drivers, cards from both vendors at once. Right away this paints an interesting landscape for those interested; if you want an AMD card but also want some PhysX hardware power then you'll be able to slide in a 'cheap' NV series card to use for that reason (or indeed if you have an old series 8 laying about, use that if the driver supports it).

    Of course, with Havok going OpenCL and being free for games which retail for <$10 (iirc), this is probably going to be much of a muchness in the end, but it's an interesting idea at least.
  • SiliconDoc - Monday, April 6, 2009 - link

    Except you can run 2 nvidia cards, one for gaming, the other for physx.... so red fanboys are sol.

    "Right now, with Vista, you can't run more than one vendor's gfx card in your system at once due to how WDDM1.0 works; so it's AMD or NV and that's your choice. "

    WRONG, it's TWO nvidia or just ONE ati. Hello - you knew it - but you didn't say it that way - makes ati look bad, and we just cannot have that here....
  • Rhino2 - Monday, April 13, 2009 - link

    The hell are you talking about? Crossfire works in vista just fine.
