The First TXAA Game & The Test

With the release of NVIDIA's 304.xx drivers a couple of months ago, NVIDIA finally enabled driver support on Kepler for their new temporal anti-aliasing technology. First announced with the launch of Kepler, TXAA is another anti-aliasing technology developed by Timothy Lottes, an engineer in NVIDIA's developer technology group. In a nutshell, TXAA combines a wide tent (>1px) MSAA filter with a temporal filter (effectively a motion-vector based frame blend), with the goal of resolving the pesky temporal aliasing that can be seen in motion in many games.
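For the curious, here is a minimal sketch of what the temporal half of such a scheme can look like: the previous frame's output is reprojected along per-pixel motion vectors and blended with the current frame's resolved color. To be clear, this is our own illustrative Python/NumPy approximation of a motion-vector based frame blend, not NVIDIA's actual TXAA resolve; the function name, the blend weight, and the nearest-neighbor reprojection are all assumptions made for the sake of the example.

```python
import numpy as np

def temporal_blend(current, history, motion, alpha=0.1):
    """Blend the current frame with a reprojected copy of the previous output.

    current: (H, W, 3) float array - this frame's (MSAA-resolved) color
    history: (H, W, 3) float array - last frame's final output
    motion:  (H, W, 2) float array - per-pixel motion vectors in pixels (dx, dy)
    alpha:   weight given to the current frame; the rest comes from history

    Illustrative sketch only; not NVIDIA's TXAA resolve.
    """
    h, w, _ = current.shape
    ys, xs = np.mgrid[0:h, 0:w]

    # Reproject: find where each pixel was last frame by stepping back
    # along its motion vector (nearest-neighbor lookup for brevity).
    src_x = np.clip(np.rint(xs - motion[..., 0]), 0, w - 1).astype(np.intp)
    src_y = np.clip(np.rint(ys - motion[..., 1]), 0, h - 1).astype(np.intp)
    reprojected = history[src_y, src_x]

    # Exponential blend: a small alpha means most of the result comes from
    # prior frames, which suppresses temporal aliasing but also softens detail.
    return alpha * current + (1.0 - alpha) * reprojected
```

The softening we discuss below falls directly out of this kind of accumulation: with a small per-frame weight, most of each output pixel is an average of prior frames.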

Because TXAA requires MSAA support and motion vector tracking by the game itself, it can only be used in games that specifically implement it. Consequently, while NVIDIA has had driver support in place, they have been waiting for a game that implements it to ship. That finally happened last week with a patch for the MMO The Secret World, which became the first game with TXAA support.

This isn’t meant to be an exhaustive review of TXAA (MMOs and deterministic testing are like oil and water), but seeing as how this is the first time TXAA has been enabled we did want to comment on it.

On the whole, what NVIDIA is trying to accomplish here is movie-like anti-aliasing at a reasonable performance cost. Traditionally SSAA would be the solution (just as it is for most other image aliasing problems), but of course SSAA is far too expensive most of the time. TXAA's lower setting, by contrast, is just 2x MSAA plus the temporal component, which makes the process rather cheap.

Unfortunately, by gaming standards it's also noticeably blurry. This is due to the combination of the wide tent MSAA samples – which, if you remember your history, ATI tried at one time – and the temporal filter blending data from multiple frames. TXAA does a fantastic job of eliminating temporal and other forms of aliasing, but it does so at a notable cost to image clarity.
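The spatial half contributes to that softness for a simple reason: a resolve filter wider than one pixel lets samples that belong to neighboring pixels contribute to the current one. Below is a toy illustration of such a weighting, with a made-up radius and a plain linear tent purely for the sake of example; nothing here is the actual TXAA kernel.

```python
import numpy as np

def tent_weights(offsets, radius=1.5):
    """Normalized tent-filter weights for resolve samples near a pixel center.

    offsets: (N, 2) array of sample positions relative to the pixel center, in pixels.
    radius:  half-width of the tent; anything above 1.0 means samples belonging
             to neighboring pixels also contribute, which is what softens edges.

    Purely illustrative; the radius and falloff are assumptions, not TXAA's kernel.
    """
    dist = np.linalg.norm(offsets, axis=1)
    weights = np.clip(1.0 - dist / radius, 0.0, None)  # linear falloff, zero at `radius`
    return weights / weights.sum()

# Example: a center sample plus four samples one pixel away in each direction.
offsets = np.array([[0.0, 0.0], [1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]])
print(tent_weights(offsets))  # center ~0.43, each neighbor ~0.14
```

With a 1.5px tent, a sample a full pixel away still receives a third of the center sample's weight, so every output pixel is partly an average of its neighbors; that is where the loss of clarity comes from.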

Editorially speaking, we'll never disparage NVIDIA for trying new AA methods – it never hurts to try something new – but at the same time we do reserve the right to be picky. We understand the direction NVIDIA went with this and why, especially given the general push to make games more movie-like in the first place, but we're not big fans of the outcome. You would be hard pressed to find someone who hates jaggies more than I do (which is why we have SSAA in one of our tests), but games are an interactive medium, and I have come to expect sharpness, the kind of sharpness that would make my eyes bleed. That goes doubly for multiplayer games, where being able to see the slightest movement in the distance can be a distinct advantage.

To that end, TXAA is unquestionably an interesting technology and worth keeping an eye on, but practically speaking, AMD's efforts to implement complex lighting cheaply on a forward renderer (thereby keeping MSAA cheap and effective) are probably more relevant to improving the state of AA. This is by no means the final word, however, and we'll certainly revisit TXAA in detail in the future, once it's enabled in a game that offers a more deterministic way of testing image quality.

The Test

NVIDIA’s GTX 660 Ti launch drivers are 305.37, which are a further continuation of the 304.xx branch. Compared to the previous two 304.xx drivers there are no notable performance changes or bug fixes that we’re aware of.

Meanwhile on the AMD side we're continuing to use the Catalyst 12.7 betas released back in late June. AMD just released Catalyst 12.8 yesterday, which appears to be a finalized version of the 12.7 driver.

On a final note, for the time being we have dropped Starcraft II from our tests. The recent 1.5 patch has had a notably negative impact on performance (and has broken our ability to play replays without logging in every time), so we need to investigate the issue further and will likely have to rebuild our entire collection of SC2 benchmarks.

CPU: Intel Core i7-3960X @ 4.3GHz
Motherboard: EVGA X79 SLI
Chipset Drivers: Intel 9.2.3.1022
Power Supply: Antec True Power Quattro 1200
Hard Disk: Samsung 470 (256GB)
Memory: G.Skill Ripjaws DDR3-1867 4 x 4GB (8-10-9-26)
Case: Thermaltake Spedo Advance
Monitors: Samsung 305T, Asus PA246Q
Video Cards: AMD Radeon HD 6970, AMD Radeon HD 7870, AMD Radeon HD 7950, AMD Radeon HD 7950B, AMD Radeon HD 7970, NVIDIA GeForce GTX 560 Ti, NVIDIA GeForce GTX 570, Zotac GeForce GTX 660 Ti AMP! Edition, EVGA GeForce GTX 660 Ti Superclocked, Gigabyte GeForce GTX 660 Ti OC, NVIDIA GeForce GTX 670
Video Drivers: NVIDIA ForceWare 304.79 Beta, NVIDIA ForceWare 305.37, AMD Catalyst 12.7 Beta
OS: Windows 7 Ultimate 64-bit

 

Comments

  • CeriseCogburn - Saturday, August 25, 2012

    The 660Ti has a bios SUPER roxxor feature...in the MSI version.. ROFL !! hahaha
    http://www.techpowerup.com/reviews/MSI/GTX_660_Ti_...

    It seems that MSI has added some secret sauce, no other board partner has, to their card's BIOS. One indicator of this is that they raised the card's default power limit from 130 W to 175 W, which will certainly help in many situations.
    The card essentially uses the same power as other cards, but is faster - leading to improved performance per Watt.
    Overclocking works great as well and reaches the highest real-life performance, despite not reaching the lowest GPU clock. This is certainly an interesting development. We will, hopefully, see more board partners pick up this change.
    ROFL HAHAHAAHAAAAAAAAAAA
    So this is the one you want now Galidou.
    " Pros: This thing is pretty amazing. Tried running Skyrim on Ultra, 2k textures, and 14 other visual mods. With this card, I ran it all with no lagg at all, with a temp under 67. Love it. "
    http://www.newegg.com/Product/Product.aspx?Item=N8...
  • Galidou - Tuesday, September 4, 2012

    Gigabyte did the same; the board power is up to 180 watts if you tweak it, and even with both cards overclocked (my wife's Gigabyte 660 Ti OC and my Sapphire 7950 OC), the 7950 wins hands down at 3-monitor resolution.

    How can you keep trying to explain things when the only side of the coin you can speak of is Nvidia? Sorry, I see the good in both, while you can't say a good thing about AMD. Both of my computers use overclocked Intel Sandy Bridge/Ivy Bridge K CPUs; I'm no AMD fan, but I know I did the right thing and did my research, having BOTH freaking cards in HAND and testing them side by side with my 3570K @ 4.6GHz.

    My 7950 wins @ 3 monitors in Skyrim EASILY; you can't say anything to that because you ain't got both cards in hand. Geez, will you freaking understand some day. And no, I ain't got any freaking problem with my drivers... And I paid the same price for the 7950 as for the GTX 660 Ti. EXACT same price. $319 before taxes.

    Geez it's complicated when arguing with you because you ain't open to any opinions/facts other than: AMD IS CRAP, NVIDIA WINS EVERYTHING, AMD IS CRAP, NVIDIA WINS EVERYTHING, HERE'S MY LINK TO A WEBSITE THAT SHOWS THE 660TI WINNING AGAINST A 7970 AT EVERYTHING EVEN 6 MONITORS LOOK LOOK LOOK.
  • TheJian - Friday, August 24, 2012

    I was speaking to their finances. If you see in one of my other posts, I believed they deserved 20bil from Intel, but courts screwed them. That is part of what I meant. They deserved their profits and more. Tough to get profits when Intel is stealing them basically by blocking your products at every end.

    No comment was directed at "dumb" employees. I said it was hard to overcome, not easy. Also that they had the crown for 3 years and weren't allowed to get just desserts. I'm sorry you didn't get that from the posts. I like AMD. I just fear they're on their last financial leg. I've owned their stock 4 times over the last 10 years. There doesn't look like there will be a 5th is all I'm saying. I speak from a stock/company financial position sometimes since I've bought both and follow their income statements. I'm sure they're all great people that work there, no comment on them (besides management's mishandling of Dirk Meyer, ATI overpurchase).
  • felipetga - Thursday, August 16, 2012

    I have been holding off on upgrading my GTX 460 256-bit. I wonder if this card will be bottlenecked by my C2Q 9550 @ 3.6GHz....
  • dishayu - Thursday, August 16, 2012

    It won't. You need to SLI/CF 2 top end cards for the processor to be a bottleneck.
  • tipoo - Thursday, August 16, 2012

    Only on some games, but the majority aren't as CPU intensive as they are GPU intensive, so it would still be a nice upgrade for you.
  • Jamahl - Thursday, August 16, 2012

    Do you realise that the majority of 660 Ti's being benchmarked at other techsites are overclocked vs the stock Radeons?
  • Biorganic - Thursday, August 16, 2012

    Exactly this. Anyone who follows these respective cards, 7950:670, 7970:680 etc., knows that the AMD alternatives have excellent overclocking potential. All these reviews are comparing high-clocked GTX cards vs stock or very conservatively boosted AMD cards. I can get my 7950 to 1000 MHz on stock voltage. That will destroy this toy they call a Ti. Sorry but the results seem a bit biased.
  • Ryan Smith - Thursday, August 16, 2012

    "Sorry but the results seem a bit biased."

    Just so we're clear, are you talking about our article, or articles on other sites?

    If it's the former, in case you've missed it we are explicitly testing a reference-clocked GTX 660 Ti in the form of Zotac's card at reference clocks (its hardware is identical to their official reference-clocked model).
  • mwildtech - Thursday, August 16, 2012

    Biased?? This guy is an idiot. Anandtech is the least biased tech site on the interwebs. Ryan - awesome review! keep up the good work.
