The First TXAA Game & The Test

With the release of NVIDIA’s 304.xx drivers a couple of months ago, NVIDIA finally enabled driver support on Kepler for their new temporal anti-aliasing technology. First announced with the launch of Kepler, TXAA is another anti-aliasing technology developed by Timothy Lottes, an engineer in NVIDIA’s developer technology group. In a nutshell, TXAA is a wide tent (>1px) MSAA filter combined with a temporal filter (effectively a motion-vector based frame blend) that is intended to resolve the pesky temporal aliasing that can be seen in motion in many games.
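The temporal half of that pipeline can be sketched in a few lines. This is a toy illustration, not NVIDIA’s implementation — the per-pixel reprojection, the exponential blend, and the `alpha` weight are all assumptions made for the sketch, and real TXAA additionally handles sub-pixel jitter and history rejection:

```python
import numpy as np

def temporal_blend(current, history, motion, alpha=0.9):
    """Blend this frame with a motion-reprojected copy of the previous result.

    current: (H, W, 3) float array - this frame's shaded color
    history: (H, W, 3) float array - the accumulated previous output
    motion:  (H, W, 2) int array   - per-pixel motion vectors (dy, dx)
    alpha:   history weight; higher means more smoothing, but also more blur
    """
    h, w, _ = current.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Reproject: fetch the history sample from where this pixel was last frame
    py = np.clip(ys - motion[..., 0], 0, h - 1)
    px = np.clip(xs - motion[..., 1], 0, w - 1)
    reprojected = history[py, px]
    # Exponential moving average across frames suppresses crawling edges
    return alpha * reprojected + (1.0 - alpha) * current
```

With zero motion vectors this degenerates to a plain exponential average over successive frames, which is exactly where the characteristic temporal-AA softness comes from.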

Because TXAA requires MSAA support and motion vector tracking by the game itself, it can only be used in games that specifically implement it. Consequently, while NVIDIA had enabled driver support for it, they’ve been waiting on a game to be released that implements it. That release finally happened last week with a patch for the MMO The Secret World, which became the first game with TXAA support.

This isn’t meant to be an exhaustive review of TXAA (MMOs and deterministic testing are like oil and water), but seeing as how this is the first time TXAA has been enabled we did want to comment on it.

On the whole, what NVIDIA is trying to accomplish here is to implement movie-like anti-aliasing at a reasonable performance cost. Traditionally SSAA would be the solution here (just as it is for most other image aliasing problems), but of course SSAA is much too expensive most of the time. At its lowest setting TXAA is just 2x MSAA plus the temporal component, which makes the process rather cheap.

Unfortunately by gaming standards it’s also really blurry. This is due to the combination of the wide tent MSAA samples – which if you remember your history, ATI tried at one time – and the temporal filter blending data from multiple frames. TXAA does a completely fantastic job of eliminating temporal and other forms of aliasing, but it does so at a notable cost to image clarity.
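To see why the wide tent contributes to that blur, consider a toy 1D tent filter. The 1.5px radius here is an assumption for illustration only; the actual kernel widths in TXAA (and in ATI’s old wide-tent modes) differ:

```python
def tent_weight(distance, radius=1.5):
    """Weight of a sample `distance` pixels from the pixel center
    under a triangle ("tent") filter of the given radius."""
    return max(0.0, 1.0 - abs(distance) / radius)

# With a radius wider than 1px, samples centered on the neighboring
# pixels (distance 1.0) still get nonzero weight, so every output
# pixel mixes in some of its neighbors' color - i.e. softening.
weights = [tent_weight(d) for d in (-1.0, 0.0, 1.0)]
total = sum(weights)
normalized = [w / total for w in weights]  # ~[0.2, 0.6, 0.2]
```

A standard 1px resolve would give those neighboring samples zero weight, which is why narrow filters look sharper at the cost of more visible aliasing.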

Editorially speaking we’ll never disparage NVIDIA for trying new AA methods – it never hurts to try something new – however at the same time we do reserve the right to be picky. We completely understand the direction NVIDIA went with this and why they did it, especially since there’s a general push to make games more movie-like in the first place, but we’re not big fans of the outcome. You would be hard pressed to find someone who hates jaggies more than I do (which is why we have SSAA in one of our tests), but from games, as an interactive medium, I have come to expect sharpness – sharpness that would make my eyes bleed. That goes double for multiplayer games, where being able to see the slightest movement in the distance can be a distinct advantage.

To that end, TXAA is unquestionably an interesting technology and worth keeping an eye on in the future, but practically speaking AMD’s efforts to implement complex lighting cheaply on a forward renderer (and thereby making MSAA cheap and effective) are probably more relevant to improving the state of AA. But this is by no means the final word, and we’ll certainly revisit TXAA in detail in the future once it’s enabled on a game that offers a more deterministic way of testing image quality.

The Test

NVIDIA’s GTX 660 Ti launch drivers are 305.37, which are a further continuation of the 304.xx branch. Compared to the previous two 304.xx drivers there are no notable performance changes or bug fixes that we’re aware of.

Meanwhile on the AMD side we’re continuing to use the Catalyst 12.7 betas released back in late June. AMD just released Catalyst 12.8 yesterday, which appears to be a finalized version of the 12.7 driver.

On a final note, for the time being we have dropped Starcraft II from our tests. The recent 1.5 patch has had a notable negative impact on our performance (and disabled our ability to play replays without logging in every time), so we need to further investigate the issue and likely rebuild our entire collection of SC2 benchmarks.

CPU: Intel Core i7-3960X @ 4.3GHz
Motherboard: EVGA X79 SLI
Chipset Drivers: Intel 9.2.3.1022
Power Supply: Antec True Power Quattro 1200
Hard Disk: Samsung 470 (256GB)
Memory: G.Skill Ripjaws DDR3-1867 4 x 4GB (8-10-9-26)
Case: Thermaltake Spedo Advance
Monitor: Samsung 305T
Asus PA246Q
Video Cards: AMD Radeon HD 6970
AMD Radeon HD 7870
AMD Radeon HD 7950
AMD Radeon HD 7950B
AMD Radeon HD 7970
NVIDIA GeForce GTX 560 Ti
NVIDIA GeForce GTX 570
Zotac GeForce GTX 660 Ti AMP! Edition
EVGA GeForce GTX 660 Ti Superclocked
Gigabyte GeForce GTX 660 Ti OC
NVIDIA GeForce GTX 670
Video Drivers: NVIDIA ForceWare 304.79 Beta
NVIDIA ForceWare 305.37
AMD Catalyst 12.7 Beta
OS: Windows 7 Ultimate 64-bit

 

Comments

  • Oxford Guy - Thursday, August 16, 2012 - link

    What is with the 285 being included? It's not even a DX 11 card.

    Where is the 480? Why is the 570 included instead of the 580?

    Where is the 680?
  • Ryan Smith - Saturday, August 18, 2012 - link

    The 285 was included because I wanted to quickly throw in a GTX 285 card where applicable, since NVIDIA is promoting the GTX 660 Ti as a GTX 200 series upgrade. Basically there was no harm in including it where we could.

    As for the 480, it's equivalent to the 570 in performance (eerily so), so there's never a need to break it out separately.

    And the 680 is in Bench. It didn't make much sense to include a card $200 more expensive which would just compress the results among the $300 cards.
  • CeriseCogburn - Sunday, August 19, 2012 - link

    So you're saying the 680 is way faster than the 7970 which you included in every chart, since the 7970 won't compress those $300 card results.
    Thanks for admitting that the 7970 is so much slower.
  • Pixelpusher6 - Friday, August 17, 2012 - link

    Thanks Ryan. Great review as always.

    I know one of the differentiating factors for the Radeon 7950s is their 3GB of RAM, but I was curious: are there any current games that will max out 2GB of RAM with high resolution, AA, etc.?

    I think it's interesting how similar AMD's and Nvidia's GPUs are this generation. I believe Nvidia will be releasing the GTX 660 non-Ti based on GK106. Leaked specs seem to be similar to this card, but with the texture units reduced to 64. I wonder how much of a performance reduction this will account for. I think it will be hard for Nvidia to get the same kind of performance / $ as, say, the GTX 460 / 560 Ti this generation, because GK104 has to fill in more market segments.

    Also I wasn't aware that Nvidia was still having trouble meeting demand for GK104 chips; I thought those issues were all cleared up. I think when AMD released their 7000 series chips they should have taken advantage of being first to market and been more competitive on price to grow market share rather than increase margins. At that time someone sitting on 8800GT-era hardware would be hard pressed to upgrade, knowing that AMD's inflated prices would come down once Nvidia brought their GPUs to market. People who hold on to their cards for a number of years are unlikely to upgrade 6 months later to Nvidia's product. If AMD cards had been priced lower at the time, a lot more people would have bought them, beating Nvidia before they even had a card on the market. I do give some credit to AMD for preparing for this launch and adjusting prices, but in my opinion this should have been done much earlier. AMD management needs to be more aggressive and catch Nvidia off guard, rather than just reacting to whatever they do. I would "preemptively" strike at the GTX 660 non-Ti by lowering prices on the 7850 to $199. Instead it seems they'll follow the trend and keep it at $240-250 right up until the launch of the GTX 660, then lower it to $199.
  • Ryan Smith - Saturday, August 18, 2012 - link

    Pixelpusher, there are no games we test that max out 2GB of VRAM out of the box. 3GB may one day prove to be advantageous, but right now, even at multi-monitor resolutions, 2GB is doing the job (since we're seeing these cards run out of compute/render performance before they run out of RAM).
  • Sudarshan_SMD - Friday, August 17, 2012 - link

    Where are naked images of the card?
  • CeriseCogburn - Thursday, August 23, 2012 - link

    You don't undress somebody you don't love.
  • dalearyous - Friday, August 17, 2012 - link

    it seems the biggest disappointment i see in comments is the price point.

    but if this card comes bundled with borderlands 2, and you were already planning on buying borderlands 2 then this puts the card at $240, worth it IMO.
  • rarson - Friday, August 17, 2012 - link

    but it's the middle of freaking August. While Tahiti was unfortunately clocked a bit lower than it probably should have been, and AMD took a bit too long to bring out the GE edition cards, Nvidia is now practically 8 months behind AMD, having only just released a $300 card. (In the 8 months that have gone by since the release of the 7950, its price has dropped from $450 to $320, effectively making it a competitor to the 660 Ti. AMD is able to compete on price with a better-performing card by virtue of the fact that it simply took Nvidia too damn long to get their product to market.) By the time the bottom end appears, AMD will be ready for Canary Islands.

    It's bad enough that Kepler (and Fermi, for that matter) was so late and so not available for several months, but it's taking forever to simply roll out the lower-tier products (and yes, I know 28nm wafers have been in short supply, but that's partially due to Nvidia's crappy Kepler yields... AMD have not had such supply problems). Can you imagine what would have happened if Nvidia actually tried to release GK110 as a consumer card? We'd have NOTHING. Hot, unmanufacturable nothing.

    Nvidia needs to get their shit together. At the rate they're going, they'll have to skip an entire generation just to get back on track. I liked the 680 because it was a good performer, but that doesn't do consumers any good when it's 4 months late to the party and almost completely unavailable. Perhaps by the end of the year, 28nm will have matured enough and Nvidia will be able to design something that yields decently while still offering the competitiveness that the 680 brought us, because what I'd really like to see is both companies releasing good cards at the same time. Thanks to Fermi and Kepler, that hasn't happened for a while now. Us consumers benefit from healthy competition and Nvidia has been screwing that up for everyone. Get it together, Nvidia!
  • CeriseCogburn - Sunday, August 19, 2012 - link

    So as any wacko fanboy does, you fault nVidia for releasing a card later that drives the very top end tier amd cards down from the 579+ shipping I paid to $170 less plus 3 free games.
    Yeah buddy, it's all nVidia's fault, and they need to get their act together, and if they do in fact get their act together, you can buy the very top amd card for $150, because that's likely all it will be worth.
    Good to know it's all nVidia's fault. AMD from $579 plus shipping to $409 and 3 free games, and nVidia sucks for not having its act together.
    The FDA as well as the EPA should ban the koolaid you're drinking.
