Image Quality & AA

When it comes to image quality, the big news for Fermi is what NVIDIA has done about anti-aliasing fake geometry such as billboards (alpha-tested textures that stand in for real geometry). Fermi has several new tricks for dealing with it.

The first is the ability to use CSAA’s coverage samples as additional samples for Alpha To Coverage, cheaply approximating anti-aliasing on fake geometry. With the extra samples CSAA affords in this mode, Fermi can generate more transparency levels, letting billboards blend in the way properly anti-aliased geometry would.
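
To see why more samples mean better blending, here is a deliberately simplified toy model of Alpha To Coverage (this is not NVIDIA’s actual hardware algorithm): the pixel’s alpha value is converted into a coverage bitmask, so the number of distinct transparency levels is bounded by the number of samples available.

```python
def alpha_to_coverage_mask(alpha, num_samples):
    """Toy model: convert an alpha value into a coverage bitmask.

    The GPU enables a number of samples proportional to alpha, so the
    more samples a mode provides, the finer the transparency steps.
    """
    enabled = round(alpha * num_samples)
    return (1 << enabled) - 1  # low 'enabled' bits set

# With plain 4x MSAA an alpha of 0.4 can only snap to 2 of 4 samples;
# with 32 CSAA samples the same alpha resolves to 13 of 32.
print(bin(alpha_to_coverage_mask(0.4, 4)))   # 0b11
print(bin(alpha_to_coverage_mask(0.4, 32)))  # 0b1111111111111
```

Real implementations can also dither the mask across neighboring pixels to squeeze more apparent transparency levels out of a fixed sample count.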

The second change is a new CSAA mode: 32x. 32x is designed to go hand-in-hand with the CSAA Alpha To Coverage changes, generating an additional 16 coverage samples over 16xQ mode for a total of 32 samples (8 color/Z samples plus 24 coverage-only samples) and giving a total of 63 possible levels of transparency on fake geometry using Alpha To Coverage.

In practice these first two changes haven’t had the effect we were hoping for. Coming out of CES we thought this would greatly improve NVIDIA’s ability to anti-alias fake geometry using cheap multisampling techniques, but as it turns out Age of Conan is the only game that benefits significantly. The ultimate solution is for more developers of DX10+ applications to enable Alpha To Coverage so that anyone’s MSAA hardware can anti-alias their fake geometry, but we’re not there yet.

So it’s the third and final change that’s the most interesting. NVIDIA has added a new Transparency Supersampling (TrSS) mode for Fermi (ed: and GT240) that picks up where the old one left off. The previous TrSS mode only worked on DX9 titles, leaving users with few options for anti-aliasing fake geometry in DX10 games. The new TrSS mode works under DX10; it’s as simple as that.

So why is this a big deal? Because a lot of DX10 games suffer from badly aliased fake geometry, including some very popular titles. Under Crysis in DX10 mode, for example, you can’t currently anti-alias the foliage, and even brand-new games such as Battlefield: Bad Company 2 suffer from aliasing. NVIDIA’s new TrSS mode fixes all of this.


Bad Company 2 DX11 Without Transparency Supersampling


Bad Company 2 DX11 With Transparency Supersampling

The bad news is that it’s not quite finished. As you’ll see in our screenshots it works, but the performance hit is severe: the mode is currently super-sampling more than it needs to, resulting in massive performance drops. NVIDIA tells us this should be fixed next month, at which point the performance hit should be similar to that of the old TrSS mode under DX9. We’ve gone ahead and taken screenshots and benchmarks of the current implementation, but keep in mind that performance should improve greatly next month.

So with that said, let’s look at the screenshots.

| NVIDIA GeForce GTX 480 | NVIDIA GeForce GTX 285 | ATI Radeon HD 5870 | ATI Radeon HD 4890 |
|------------------------|------------------------|--------------------|--------------------|
| 0x                     | 0x                     | 0x                 | 0x                 |
| 2x                     | 2x                     | 2x                 | 2x                 |
| 4x                     | 4x                     | 4x                 | 4x                 |
| 8xQ                    | 8xQ                    | 8x                 | 8x                 |
| 16xQ                   | 16xQ                   | DX9: 4x            | DX9: 4x            |
| 32x                    | DX9: 4x                | DX9: 4x + AAA      | DX9: 4x + AAA      |
| 4x + TrSS              | 4x                     | DX9: 4x + TrSS     | DX9: 4x + SSAA     |
| DX9: 4x                |                        |                    |                    |
| DX9: 4x + TrSS         |                        |                    |                    |

With the exception of NVIDIA’s new TrSS mode, very little has changed. Under DX10 all of the cards produce a very similar image, and once you reach 4x MSAA each card produces a near-perfect image. NVIDIA’s new TrSS mode is the only standout for DX10.

We’ve also included a few DX9 shots, although we are in the process of moving away from DX9. These allow us to showcase NVIDIA’s old TrSS mode, along with AMD’s Adaptive AA and Super-Sample AA modes. Note how both TrSS and AAA do a solid job of anti-aliasing the foliage, which makes it all the more a shame that they haven’t been available under DX10.



When it comes to performance, keep in mind that both AMD and NVIDIA have been trying to improve their 8x MSAA performance. When we reviewed the Radeon HD 5870 back in September we found that AMD’s 8x MSAA performance was virtually unchanged from 4x, and 6 months later that still holds true: the performance hit moving from 4x MSAA to 8x MSAA on both Radeon cards is roughly 13%. NVIDIA, on the other hand, took a stiffer penalty under DX10 on the GTX 285, where performance fell by 25%. With NVIDIA’s 8x MSAA improvements for Fermi, that gap has been closed: the penalty for moving from 4x to 8x MSAA is only 12%, putting the GTX 480 right up there with the Radeon cards in this respect. NVIDIA can now do 8x MSAA as cheaply as AMD can.
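
The percentages here are straightforward frame-rate arithmetic. A quick sketch (the frame rates below are made-up placeholders, not our benchmark numbers):

```python
def msaa_penalty(fps_4x, fps_8x):
    """Percent of performance lost moving from 4x to 8x MSAA."""
    return (1.0 - fps_8x / fps_4x) * 100.0

# Placeholder numbers chosen to mirror the hits quoted above:
print(round(msaa_penalty(60.0, 52.8), 1))  # 12.0 (GTX 480 / Radeon class)
print(round(msaa_penalty(60.0, 45.0), 1))  # 25.0 (GTX 285 under DX10)
```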

Meanwhile we can see the significant performance hit on the GTX 480 for enabling the new TrSS mode under DX10. If NVIDIA really can improve the performance of this mode to near-DX9 levels, then they are going to have a very interesting AA option on their hands.

Last but not least, there’s anisotropic filtering quality. With the Radeon HD 5870 we saw AMD implement true angle-independent AF, and we’ve been wondering whether we would see the same from NVIDIA. The answer is no: NVIDIA’s AF quality remains unchanged from the GTX 200 series. In this case that’s not necessarily a bad thing; NVIDIA already had great AF, even if it was angle-dependent. More to the point, we have yet to find a game where the difference between AMD’s and NVIDIA’s AF modes is noticeable. Technically AMD’s AF is better, but not by enough to make a practical difference.
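
For reference, an anisotropic filter derives its sample count from the screen-space derivatives of the texture coordinates; whether that footprint is handled the same way at every surface angle is what "angle-independent" refers to. Here is a sketch following the approximation outlined in the EXT_texture_filter_anisotropic spec (the function name and epsilon guard are our own):

```python
import math

def aniso_probe_count(dudx, dvdx, dudy, dvdy, max_aniso=16):
    """Approximate number of probes an anisotropic filter takes,
    per the scheme outlined in EXT_texture_filter_anisotropic."""
    px = math.hypot(dudx, dvdx)  # texel footprint along screen x
    py = math.hypot(dudy, dvdy)  # texel footprint along screen y
    p_max, p_min = max(px, py), min(px, py)
    return min(math.ceil(p_max / max(p_min, 1e-9)), max_aniso)

# A surface viewed nearly edge-on has a long, thin footprint and
# needs many probes; a face-on surface needs only one.
print(aniso_probe_count(8.0, 0.0, 0.0, 1.0))  # 8
print(aniso_probe_count(1.0, 0.0, 0.0, 1.0))  # 1
```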


GeForce GTX 480


GeForce GTX 285


Radeon 5870

197 Comments


  • WiNandLeGeNd - Saturday, March 27, 2010 - link

    I think this was a great review, as mentioned previously, very objective. I think though that I may get a 480, because when I buy a card I keep it for 3 to 4 years before I get a new one, aka every other gen. And seeing that tessellation is really the gift horse of DX11 and how much more tessellation power is in the 480's, I think it could very much pay off in the future. If not then I spent an extra $85 for a tad extra performance, as I just pre-ordered one for $485 and the 5870's are at $400 still.

    My only concern is heat and power, but most of the cards have a lifetime warranty. Hopefully my OCZ GamerXtreme 850W can handle it at max loads. The two 12V rails for the two 6-pin PCIe connectors are 20A each. I saw 479W max consumption, however that was FurMark; at 12V that's 39.9 amps, so it would be extremely close if there is ever a game that utilizes that much power. Although if I recall, ATI specifically stated a while back not to use FurMark as it pushes loads that are not possible to see in an actual game; I think they had an issue with the 4000 series burning out power regulators, correct me if I'm wrong.
  • Alastayr - Saturday, March 27, 2010 - link

    I'm with sunburn on this one. Your reasoning doesn't make much sense. You must've not followed the GPU market for the last few years because

    first) "every other gen" would mean a 2 year cycle
    second) Nothing's really gonna pay off in the future, as the future will bring faster cards for a fraction of the price. You'd only enjoy those questionable benefits until Q4, when AMD releases Northern Islands and nVidia pops out GF100b or whatever they'll call it.
    third) Tessellation won't improve further that fast. If anything, developers will focus on the lowest common denominator, which would be Cypress. Fermi's extra horsepower will most likely stay unused.
    fourth) Just look at your power bill. The 25W difference with a "typical" Idle scheme (8h/day; 350d/y) comes to 70kWh which where I live translates to around $20 per year. That's Idle *only*. You're spending way more than just $85 extra on that card.
    fifth) The noise will kill you. This isn't a card that just speeds up for no reason. You can't just magically turn down the fan from 60% to 25% and still enjoy temps of <90°C like on some GTX 260 boards. Turn up your current fan to 100% for a single day. Try living through that. That's probably what you're buying.

    In the end everyone has to decide this for himself. But for someone to propose keeping a GTX 480 in his PC for a whopping 3-4 years... I don't know man. I'd rather lose a finger or two. ;)

    tl;dr I know, I know. But really people. Those cards aren't hugely competitive, priced too high, and nV's drivers suck as much as ATi's (allegedly) do nowadays. Which is to say, neither do.

    I could honestly bite me right now. I had a great deal for a 5850 in Nov. and I waited for nV to make their move. Now the same card will cost me $50 more, and I've only wasted time by waiting for the competitive GTX 470 that never was. Argh.
  • Sunburn74 - Saturday, March 27, 2010 - link

    That's kind of bad logic imo. I'm not a fanboy on either side, but it's clear to me that Nvidia targeted the performance of their cards to fit exactly between the 5970, the 5870, and 5850. It's much harder to release a card not knowing what the other guy truly has, as opposed to releasing a card knowing exactly what sort of performance levels you have to hit.

    Two, realistically, think of the noise. I mean if you've ever heard a GTX 260 at 100 percent fan speed, that's the sort of fan noise you're going to be experiencing on a regular basis. It's not a mild difference.

    And three, realistically for the premium you're paying for the extra performance (which is not useful right now as there are no games to take advantage of it) as well as for the noise, heat and power, you could simply buy the cheaper 5870, save that 85-150 dollars extra, and sell off the 5870 when the time is right.

    I just don't see why anyone would buy this card unless they were specifically taking advantage of some of the compute functions. As a consumer card it is a failure. Power and heat be damned, the noise, the noise! Take your current card up to 100 percent fan speed, listen to it for a few minutes, and that's about what you should expect from these GPUs.
  • andyo - Saturday, March 27, 2010 - link

    I too am getting the warning message with Firefox 3.6.2. Posting this on IE. Here's the message:

    http://photos.smugmug.com/photos/820690277_fuLv6-O...
  • JarredWalton - Saturday, March 27, 2010 - link

    We're working on it. Of course, the "Internet Police" have now flagged our site as malicious because of one bad ad that one of the advertisers put up, and it will probably take a week or more to get them to rescind the "Malware Site" status. Ugh....
  • jeffrey - Saturday, March 27, 2010 - link

    Give the advertiser that put up the bad ad hell!
  • LedHed - Saturday, March 27, 2010 - link

    The people who are going to buy the GTX 480/470 are enthusiasts who most likely bought the GTX 295 or had 200 Series SLI. So not including the 295 in every bench is kind of odd. We need to see how the top end of the last gen does against the new gen top end.
  • Ryan Smith - Saturday, March 27, 2010 - link

    What chart is the 295 not in? It should be in every game test.
  • kc77 - Saturday, March 27, 2010 - link

    Well the 295 beats the 470 in most benches so there's no need to really include it in all benches. Personally I think the 480 is the better deal. Although I am not buying those cards until a respin/refresh, those temps and power requirements are just ridiculous.
  • bigboxes - Saturday, March 27, 2010 - link

    I know you "upgraded" your test PSU to the Antec 1200W PSU, but did you go back and try any of these tests/setups with your previous 850W PSU to see if it could handle the power requirements? It seemed that only your 480 SLI setup drew 851W total system power in the FurMark load test. Other than that scenario it looks like your old PSU should handle the power requirements just fine. Any comments?
