Super AA Modes

There are some older games that wouldn't see any benefit from a multi-GPU solution, as these titles may not be GPU limited. In order to provide some benefit to these games (while at the same time offering higher image quality), ATI has devised four multi-card display modes. These modes are user selectable from the control panel and can help add smoothness and clarity to any title.

ATI's Super AA modes are not limited to any subset of titles because there is no workload split involved: each card renders the entire scene with its own unique set of sample points. Before display, the compositing engine takes the output of each card and prepares a final image for display.
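To make that compositing step concrete, here is a minimal sketch (hypothetical Python, not ATI's actual hardware logic) of what blending two full-scene renders down to a single output frame amounts to:

```python
def composite_average(frame_a, frame_b):
    """Blend two same-resolution framebuffers 50/50, as a compositing
    engine would when each card renders the full scene with its own
    set of sample points."""
    return [[(a + b) / 2.0 for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_a, frame_b)]

# The two frames disagree along edges (different sample points), so
# averaging produces intermediate values there and smooths the edge.
card_a = [[0, 255], [255, 0]]
card_b = [[0, 127], [127, 0]]
print(composite_average(card_a, card_b))  # → [[0.0, 191.0], [191.0, 0.0]]
```

Real hardware composites per scanline as the frame is sent to the display, but the arithmetic per pixel is this simple average.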

Two of the new modes simply make use of different sample points: 8xAA and 12xAA run 4x or 6x MSAA, respectively, on each card. Of course, MSAA is limited in its ability to antialias certain aspects of a 3D scene. Multisampling only works along polygon edges, while the slower supersampling method works across the entire scene (including textures). SSAA has fallen out of use because of the large performance impact it carries on a single card. The usual approach to SSAA is to render the scene at a higher resolution and then resample the image down to the desired resolution, though there are other ways of performing it.
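The render-high-then-resample approach can be sketched in a few lines (an illustrative Python helper, not any driver's actual code), here downsampling a frame rendered at twice the target resolution:

```python
def downsample_2x(img):
    """Average each 2x2 block of a high-res grayscale image into one
    output pixel -- the resolve step of a naive 2x-by-2x supersample.

    `img` is a list of rows rendered at twice the target resolution.
    """
    height, width = len(img), len(img[0])
    out = []
    for y in range(0, height, 2):
        row = []
        for x in range(0, width, 2):
            block_sum = (img[y][x] + img[y][x + 1] +
                         img[y + 1][x] + img[y + 1][x + 1])
            row.append(block_sum / 4.0)  # four ordered samples -> one pixel
        out.append(row)
    return out

# A 4x4 "render" resolved down to 2x2: each output pixel is the mean
# of the corresponding 2x2 block of samples.
hi_res = [
    [0, 0, 4, 4],
    [0, 0, 4, 4],
    [8, 8, 2, 2],
    [8, 8, 2, 2],
]
print(downsample_2x(hi_res))  # → [[0.0, 4.0], [8.0, 2.0]]
```

The performance cost is obvious from the sketch: every output pixel requires four fully shaded and textured samples, which is why SSAA fell out of favor on single cards.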

ATI handles SSAA by having each card render the entire scene at the desired resolution, with one card's sample positions shifted half a pixel diagonally relative to the other's. They combine this shift with their 8x and 12x MSAA modes to produce 10xAA (4x + 4x + 2xSS) and 14xAA (6x + 6x + 2xSS). These quality modes should prove to be phenomenal.
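The sample-count arithmetic behind these mode names works out as follows (a sketch based on the figures in the text; `super_aa_samples` is a made-up helper, not part of any driver):

```python
def super_aa_samples(msaa_per_card, with_ss_shift):
    """Effective AA sample count for a two-card Super AA mode: each
    card contributes its own unique MSAA sample set, and the optional
    half-pixel diagonal shift adds 2x supersampling on top."""
    samples = 2 * msaa_per_card      # two cards, unique sample points
    if with_ss_shift:
        samples += 2                 # 2xSS from the half-pixel shift
    return samples

print(super_aa_samples(4, False))  # → 8   (8xAA)
print(super_aa_samples(6, False))  # → 12  (12xAA)
print(super_aa_samples(4, True))   # → 10  (10xAA: 4x + 4x + 2xSS)
print(super_aa_samples(6, True))   # → 14  (14xAA: 6x + 6x + 2xSS)
```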

This 2xSS mode shouldn't be confused with a normal 2x-vertical by 2x-horizontal resolution mode. In that case, each pixel has four ordered sample points that scale down to one pixel; in ATI's mode, two sample points are used per pixel in a rotated-grid fashion.
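The distinction can be illustrated with the sample offsets themselves (the coordinates below are illustrative, not ATI's actual sample positions):

```python
# Illustrative per-pixel sample offsets, in pixel units.

# Ordered 2x2 grid: four samples per pixel, as in a plain
# double-resolution supersample that resolves four points to one pixel.
ordered_4x = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]

# Rotated-grid 2xSS: two samples per pixel on a diagonal -- one from
# each card, the second shifted half a pixel diagonally.
rotated_2x = [(0.25, 0.25), (0.75, 0.75)]

def distinct_levels(samples, axis):
    """Count distinct sample offsets along one axis. For a
    near-vertical or near-horizontal edge, more distinct offsets mean
    more coverage gradations per pixel."""
    return len({s[axis] for s in samples})

# The rotated pattern reaches the same number of horizontal and
# vertical gradations as the ordered grid with half the samples.
print(distinct_levels(ordered_4x, 0), distinct_levels(rotated_2x, 0))  # → 2 2
print(distinct_levels(ordered_4x, 1), distinct_levels(rotated_2x, 1))  # → 2 2
```

This is the usual argument for rotated grids: against mostly horizontal or vertical edges, which dominate real scenes, they deliver the same edge gradation as an ordered grid of twice the sample count.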

These modes add life to games that would not otherwise benefit from multiple graphics cards, and they provide a compatibility mode for titles where alternating or splitting frames is not an option. This is a key feature that separates ATI's CrossFire from NVIDIA's SLI, and we are very eager to get our hands on hardware and test it firsthand.

Now that we know what ATI's CrossFire solution is and what it can do, let's take a look at how it stacks up to the competition.

  • yacoub - Tuesday, May 31, 2005 - link

    Is it just me or do several things about this scream "bottleneck" and "latency"? The two PCI-E x8 slots instead of x16 slots. The extra Compositing Engine chip. The ability to pair different cards such that it will drop clock speeds and/or pipelines to sync them up. The lack of direct chip-to-chip interconnect.

    I'm curious to know just how much performance gain is realized if you pair, say, an X800XL and an X850-something, over just the X850-something. And also how much bottleneck and latency there is in this implementation over the NVidia offering of SLI.

    The only upside I can see is cost/upgrade since a user can own an X800-based card (assuming they have a Crossfire compatible motherboard) and go out and buy an X850-based card later and use BOTH cards together (assuming they are both Crossfire-capable cards). Then again with those assumptions I'm not sure it's truly any more cost-effective. =\
  • LoneWolf15 - Tuesday, May 31, 2005 - link

    As usual, the fanboys of both sides come to the show to spout their comments.

    For everyone saying "Man, you have to buy a Crossfire that matches your card, and throw it away when you upgrade"...umm, don't you have to buy two of the exact same matching card for running nVidia SLI, and if you wish to upgrade, you have to sell both? Doesn't sound that different to me. One thing I think a lot of current ATI owners will be happy about is that they won't have to get rid of a card they already own and buy two of a new one; they can just buy a single Crossfire card (and of course a mainboard).

    On the other hand, to those thinking ATI has now "0wned" nVidia, it is WAY too early to tell. The solution looks promising, but if you have to sacrifice mainboard performance (i.e., SATA hard disks, memory bandwidth, etc.) it may be a hard sell. Benchmarks in Doom 3 are also not the end-all be-all. We'll have to wait for a more comprehensive performance review, including DirectX benches, and performance/quality with older games using this new AA method, as well as game compatibility reports. We'll also need to know what TRUE pricing is (we've seen claimed pricing vary quite a bit from what it has turned out to be at product release in the past two years).

    Do I hope it will beat nVidia's solution? You bet. I like ATI, but even more I like competition that drives the industry. Do we proclaim ATI the winner/loser on this one? Heck no, it isn't even a purchasable product yet.
  • ElMoIsEviL - Tuesday, May 31, 2005 - link

    23 - They ran Doom3.

    It's not an ATi game at all as we all know. And it still does REALLY well. And it's not in release stages yet.

    ;)
  • ElMoIsEviL - Tuesday, May 31, 2005 - link

    hehehehe.. it's better than SLI... hehehehe

    Figures, all the NV fans on here prolly aren't too happy today.

    I can't wait to test out the new AA modes.. :)
  • vertigo1 - Tuesday, May 31, 2005 - link

    This is insane, who on earth will buy this?!
  • JarredWalton - Tuesday, May 31, 2005 - link

    30 - Yes. The PCIe bus likely provides slower performance, as it is used for lots of other things (like communication between the CPU, RAM, and GPUs). I believe NVIDIA SLI works without the dongle but at slower speeds - at least, I heard that somewhere, but I haven't ever had an SLI board so I can't say for sure. Anyway, since DVI is a digital signal, using DVI in/out seems about as good as the SLI bridge - at least in theory. Now we just need to wait and see how theories pan out. :)
  • Jalf - Tuesday, May 31, 2005 - link

    I was under the impression they were going to use the PCI-E bus for transferring data between the cards. Is the external dongle going to handle that instead?
  • Murst - Tuesday, May 31, 2005 - link

    I really don't see how the xfire is better than sli based on hardware compatibility. Sure, you don't need the exact same cards, but you will likely buy only one x850 type card per x850 xfire. It would be extremely unlikely that someone upgrades from x850 xt pro -> x850 xt pe.

    Basically, in the end, you will buy a specific xfire tailored to your gfx card, and throw it away with the next generation of cards.
  • gxsaurav - Tuesday, May 31, 2005 - link

    Great, this just means more heat. Man, even a single 6800 nU plays every game fine, while running cool.
  • ViRGE - Tuesday, May 31, 2005 - link

    #21, yes it is. This is what hurts ATI the most: Nvidia already had 4 release cycles of experience with motherboards (2 of those being highly popular, highly recommended boards) before attempting SLI. ATI has a previous launch for a board almost universally ignored. I would not use an ATI board at this time, so I would also not consider CrossFire. ATI needs to get CrossFire working on Nvidia's boards to have a fighting chance this round.
