Final Words

What a long, strange journey it has been to this point. We have a very delayed launch from AMD featuring a part that consumes quite a bit of power and can't challenge the competition's high end offering. At face value this sounds quite a bit like NVIDIA's NV30 launch, but thankfully we wouldn't go so far as to call this NV30 Part 2: The R600 Story.

Even though AMD has not built a high end part, they have built a part that runs very consistently at its performance target (which could not be said of NV30). AMD is also not trying to pass this card off as something it's not: rather than being priced out of its class, the R600 will find a good home at a reasonable price.

Despite the delays, despite the quirks, and despite the lack of performance leadership, AMD has built a good part. It might not be as exciting as an ultra high end card, and it certainly isn't as power efficient as an 8800 GTX or Ultra, but it has quite a few positives that make it an interesting product, and more competition is always a good thing. The worst thing that could happen now is for NVIDIA to get as complacent as ATI did after R300 wiped the floor with the competition.

Let's break it down with something akin to a pro/con list. Here's what AMD did right:

R600 features a tessellator, which offers an interesting option to geeks and game developers even if it doesn't offer a lot of value to the average consumer. We've got full HD video decode acceleration for all the major codecs. There is a huge amount of processing power available for code and data that fit the structure of the hardware. Audio is integrated into the video stream and sent out over HDMI via a special adapter, allowing DVI and HDMI to coexist without the need to split the audio out from elsewhere. We like to see more antialiasing options, and even if we don't necessarily like the tent filters, the edge detect AA is a really cool concept that looks pretty good. And we absolutely love the level of architectural detail AMD has gone into with R600.
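To make the edge detect idea a little more concrete, here is a minimal C sketch of how an adaptive resolve pass of this general type might work: pixels whose resolved color disagrees strongly with a neighbor are treated as edges and resolved with a wider filter, while everything else gets a plain box resolve. The 4x sample count, the threshold, and the helper names are our own assumptions for illustration; this is not AMD's actual CFAA implementation.

```c
/* Illustrative sketch of an edge-detect style adaptive AA resolve.
 * Sample count, threshold and layout are assumptions, not AMD's CFAA. */
#include <stdio.h>
#include <math.h>

#define W 4            /* tiny grayscale framebuffer for the example */
#define H 4
#define SAMPLES 4      /* assume 4x multisampling */

static float fb[H][W][SAMPLES];   /* one sample per subpixel position */

/* plain box resolve: average the samples of one pixel */
static float box_resolve(int x, int y)
{
    float sum = 0.0f;
    for (int s = 0; s < SAMPLES; s++)
        sum += fb[y][x][s];
    return sum / SAMPLES;
}

/* crude edge test: does this pixel disagree strongly with a neighbor? */
static int is_edge(int x, int y)
{
    float c = box_resolve(x, y);
    for (int dy = -1; dy <= 1; dy++)
        for (int dx = -1; dx <= 1; dx++) {
            int nx = x + dx, ny = y + dy;
            if (nx < 0 || ny < 0 || nx >= W || ny >= H) continue;
            if (fabsf(box_resolve(nx, ny) - c) > 0.25f)  /* assumed threshold */
                return 1;
        }
    return 0;
}

/* adaptive resolve: wider filter only where an edge was detected */
static float adaptive_resolve(int x, int y)
{
    if (!is_edge(x, y))
        return box_resolve(x, y);

    float sum = 0.0f;
    int n = 0;
    for (int dy = -1; dy <= 1; dy++)
        for (int dx = -1; dx <= 1; dx++) {
            int nx = x + dx, ny = y + dy;
            if (nx < 0 || ny < 0 || nx >= W || ny >= H) continue;
            for (int s = 0; s < SAMPLES; s++) { sum += fb[ny][nx][s]; n++; }
        }
    return sum / n;
}

int main(void)
{
    /* left half bright, right half dark: creates a vertical edge */
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++)
            for (int s = 0; s < SAMPLES; s++)
                fb[y][x][s] = (x < W / 2) ? 1.0f : 0.0f;

    for (int y = 0; y < H; y++) {
        for (int x = 0; x < W; x++)
            printf("%.2f ", adaptive_resolve(x, y));
        printf("\n");
    }
    return 0;
}
```

The point of the sketch is simply that the filter footprint is chosen per pixel based on detected edges, which is why an edge detect mode can smooth geometry edges without blurring the entire frame the way the tent filters do.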

And here's what AMD did wrong:

First, they refuse to call a spade a spade: this part was absolutely delayed, and it would have been better to admit as much rather than make excuses. Forcing MSAA resolve to run on the shader hardware is less than desirable, and it degrades both pixel throughput and shader horsepower compared to implementing dedicated resolve hardware in the render back ends. Not being able to follow through with high end hardware will hurt in more than just lost margins. The thirst for wattage that the R600 displays is not what we'd like to see from an architecture that is supposed to be about efficiency. Finally, attempting to extract a high degree of instruction level parallelism with a VLIW design, when something much simpler could have exploited the huge amount of thread level parallelism inherent in graphics, was not the right move.
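To illustrate that last point, here is a small, hypothetical C sketch of the scheduling problem a 5-wide VLIW design hands to its compiler: each of the five slots in an instruction word can only be filled with an operation whose inputs are already available, so a chain of dependent operations leaves most slots empty. The toy instruction stream, the dependency encoding, and the greedy scheduler below are invented purely for illustration; they are not R600's ISA or its shader compiler.

```c
/* Hypothetical sketch of packing a dependent instruction stream into
 * 5-wide VLIW bundles; not R600's real ISA or compiler. */
#include <stdio.h>

#define SLOTS 5        /* issue slots per VLIW instruction word */
#define N     8        /* instructions in our toy shader        */

/* dep[i] = index of the instruction that i depends on, or -1 for none
 * (a single acyclic dependency per instruction keeps the toy simple) */
static const int dep[N] = { -1, 0, 1, 2, -1, 4, -1, -1 };

int main(void)
{
    int done[N] = { 0 };
    int remaining = N, bundles = 0, filled = 0;

    while (remaining > 0) {
        int issued_this_bundle[SLOTS];
        int issued = 0;

        /* greedily fill up to SLOTS slots with ready instructions */
        for (int i = 0; i < N && issued < SLOTS; i++) {
            if (done[i]) continue;
            if (dep[i] != -1 && !done[dep[i]]) continue;   /* not ready yet */
            issued_this_bundle[issued++] = i;
        }

        /* retire the bundle */
        for (int s = 0; s < issued; s++)
            done[issued_this_bundle[s]] = 1;
        remaining -= issued;
        filled    += issued;
        bundles++;

        printf("bundle %d: %d of %d slots used\n", bundles, issued, SLOTS);
    }

    printf("average slot utilization: %.0f%%\n",
           100.0 * filled / (bundles * SLOTS));
    return 0;
}
```

A real compiler will usually do better than this greedy toy, but the underlying problem is the same: instruction level parallelism has to be found within each thread, whereas the thread level parallelism in a frame full of pixels is simply there for the taking.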

Maybe that's a lot to digest, but the bottom line is that R600 is neither perfect nor a failure. The HD 2900 XT competes well with the 640MB 8800 GTS, though the 8800 GTS 320MB does have a price/performance advantage over both in all but the highest resolutions and AA settings under most current games. There are features we like about the hardware that we would love to see exploited. There is potential here, especially for Xbox 360 ports, to really shine... though console ports are often looked down upon in the PC market, particularly if they come late and offer little new to the platform.

Another big question is that we still haven't seen how either G80 or R600 handles DX10 based games. This unknown will persist for just a little while longer, as next month we should start seeing some titles support DX10. The first titles may not be representative of later DX10 titles, however, so this is something we will only be able to properly assess with time.

For now, R600 is a good starting place for AMD's DX10 initiative, and with a bit of evolution to their unified shader hardware it could eventually rise to the top. We aren't as excited about this hardware as we were about G80, and there are some drawbacks to AMD's implementation, but we certainly won't count them out of the fight. Power efficiency at 65nm remains to be seen, and NVIDIA has currently left a huge performance gap between the 8600 GTS and the 8800 GTS 320MB. If AMD is able to capitalize here with the HD 2600 series, they will certainly still have a leg to stand on. We will have to wait and see those performance results, though.

In the meantime, we are just happy that R600 is finally here after such a long wait. Let's hope for AMD's sake that the next revision of their hardware doesn't take quite so long to surface, and that it fares better against the competition's six month old products. We certainly hope we won't see a repeat of the R600 launch when Barcelona and Agena take on Core 2 Duo/Quad in a few months...

Comments

  • imaheadcase - Tuesday, May 15, 2007

    quote:

    Bad performance with AA turned on (everybody turns on AA), huge power consumption, late to the market.


    Says who? Most people I know don't care to turn on AA since they visually can't see a difference. Only people who are picky about everything they see normally do; the majority of people don't notice "jaggies" since the brain fixes it for you when you play.
  • Roy2001 - Tuesday, May 15, 2007

    quote:

    Says who? Most people I know don't care to turn on AA since they visually can't see a difference.

    Wow, I never turn it off once I am used to having AA. I cannot play games anymore without AA.
  • Amuro - Tuesday, May 15, 2007

    quote:

    the majority of people don't notice "jaggies" since the brain fixes it for you when you play.

    Says who? No one who spent $400 on a video card would turn off AA.
  • SiliconDoc - Wednesday, July 8, 2009

    Boy we'd sure love to hear those red fans claiming they turn off AA nowadays and it doesn't matter.
    LOL
    It's just amazing how thick it gets.
  • imaheadcase - Tuesday, May 15, 2007

    quote:

    Says who? No one who spent $400 on a video card would turn off AA.


    Sure they do, because it's a small "tweak" with a performance hit. I say, who spends $400 on a video card to remove "jaggies" when they are not noticeable to most people in the first place? Same reason most people don't go for SLI or Crossfire: in the end it really offers nothing substantial for most people who play games.

    Some might like it, but they would not miss it if they stopped using it for some time. It's not like it's a make or break feature of a video card.
  • motiv8 - Tuesday, May 15, 2007

    Depends on the game or player tbh.

    I play within ladders without AA turned on, but for games like Oblivion I would use AA. Depends on your needs at the time.
