
The AMD Radeon HD 6990, otherwise known as Antilles, is a card we have been expecting for some time now. In what’s become normal fashion for AMD, when they first introduced the Radeon HD 6800 series back in October, they also provided a rough timeline for the rest of the high-end members of the family. Barts would be followed by Cayman (6950/6970), which would be followed by the dual-GPU Antilles (6990).

AMD’s original launch schedule called for the whole stack to be out the door by the end of 2010 – Antilles would be the last product, likely arriving just in time for Christmas. What ended up happening, however, is that Cayman didn’t make it out until the middle of December, which put those original plans on ice. So we closed the year with the 6800 series and the single-GPU members of the 6900 series, but without a replacement for AMD’s flagship dual-GPU card, leaving AMD’s product stack in an odd place: their top card was a 5000 series part while the 6000 series occupied everything else.

So while we’ve had to wait longer than we anticipated for Antilles/6990, the wait has finally come to an end. Today AMD is launching their new flagship card, retiring the now venerable 5970 and replacing it with a new dual-GPU monster powered by AMD’s recently introduced VLIW4 design. Manufactured on the same 40nm process as the GPUs in the 5970, AMD has had to go to some interesting lengths to improve performance here. And as we’ll see, it’s going to be a doozy in more ways than one.

                        Radeon HD 6990   Radeon HD 6970   Radeon HD 6950   Radeon HD 5970
Stream Processors       2 x 1536         1536             1408             2 x 1600
Texture Units           2 x 96           96               88               2 x 80
ROPs                    2 x 32           32               32               2 x 32
Core Clock              830MHz           880MHz           800MHz           725MHz
Memory Clock (GDDR5)    1.25GHz          1.375GHz         1.25GHz          1GHz
Memory Data Rate        5.0GHz           5.5GHz           5.0GHz           4GHz
Memory Bus Width        2 x 256-bit      256-bit          256-bit          2 x 256-bit
Frame Buffer            2 x 2GB          2GB              2GB              2 x 1GB
FP64 Rate               1/4              1/4              1/4              1/5
Transistor Count        2 x 2.64B        2.64B            2.64B            2 x 2.15B
Manufacturing Process   TSMC 40nm        TSMC 40nm        TSMC 40nm        TSMC 40nm
Price Point             $699             $349             $259             N/A

For the Radeon HD 5970, AMD found themselves in an interesting position: with the 5000 series launching roughly 6 months ahead of NVIDIA’s 400 series of GPUs, they already had a lead in getting products out the door. Furthermore, NVIDIA never fully responded to the 5970, forgoing dual-GPU designs entirely with the 400 series. The 5970 was the undisputed king of video cards – no single card was more powerful. Given that lack of direct competition, how AMD would follow up on the 5970 is a matter of great interest.

But before we get too far ahead of ourselves, let’s start with the basics. The Radeon HD 6990 is AMD’s new flagship card, based on a pair of Cayman (VLIW4) GPUs mounted on a single PCB. AMD has clocked the GPU at 830MHz and the GDDR5 memory at 1250MHz (5GHz data rate). The card comes with 4GB of RAM, which due to the internal CrossFire setup of the card reduces the effective RAM capacity to 2GB, the same as AMD’s existing 6900 cards.
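As a quick illustration of the mirrored-memory math (a minimal sketch of our own; the helper function is illustrative, not an AMD tool): under CrossFire each GPU keeps its own full copy of textures and buffers, so a dual-GPU card's usable frame buffer is its total RAM divided by the number of GPUs.

```python
# Effective VRAM on a dual-GPU CrossFire card: memory is mirrored
# across GPUs (each holds a full copy of the working set), not pooled.
def effective_vram_gb(total_gb: float, num_gpus: int) -> float:
    """Usable frame buffer when each GPU mirrors the same data."""
    return total_gb / num_gpus

print(effective_vram_gb(4, 2))  # 6990: 4GB physical -> 2.0GB effective
print(effective_vram_gb(2, 2))  # 5970: 2GB physical -> 1.0GB effective
```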

Starting with the 5970, TDP limits and the laws of physics began constraining what AMD could do with a dual-GPU card; unlike the 4870X2, the 5970 wasn’t clocked high enough to match a pair of 5870s. The delta between the 5970 and the 5870 came down to the 5970 being 125MHz slower on the core and 200MHz (800MHz data rate) slower on its RAM. In practice this reduced 5970 performance to near-5850CF levels. For the 6990 this gap still exists, but it’s much smaller this time. At 830MHz the 6990’s core is only 50MHz (5.7%) slower than the 6970’s, while the memory takes a bigger hit: at 5GHz it’s 500MHz (9%) slower than the 6970’s. As a result, at stock settings the 6990 is closer to being a pair of 6970s than the 5970 was to being a pair of 5870s – though as we’ll see, there is one exception. Meanwhile the 6990’s GPUs are fully enabled, so all 1536 SPs and 32 ROPs per GPU are available, making clockspeed the only difference between the 6990 and the 6970.
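Those deltas are easy to sanity-check. The sketch below (our own back-of-the-envelope code, using the standard GDDR5 bandwidth formula of data rate times bus width divided by 8 bits per byte) reproduces the clock gaps from the spec table and shows the per-GPU bandwidth cost of running the memory at 5GHz instead of 5.5GHz:

```python
# Clock deltas between the 6990 and 6970, and per-GPU memory bandwidth.
def pct_slower(full_mhz: float, cut_mhz: float) -> float:
    """How much slower the cut-down clock is, relative to the full part."""
    return (full_mhz - cut_mhz) / full_mhz * 100

def gddr5_bandwidth_gbps(data_rate_ghz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s for one GPU."""
    return data_rate_ghz * bus_width_bits / 8

print(round(pct_slower(880, 830), 1))    # core clock: 6990 vs 6970 -> 5.7
print(round(pct_slower(5500, 5000), 1))  # memory data rate: -> 9.1
print(gddr5_bandwidth_gbps(5.0, 256))    # 6990 per GPU: 160.0 GB/s
print(gddr5_bandwidth_gbps(5.5, 256))    # 6970: 176.0 GB/s
```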

Compared to the 5970, the official idle TDP is down somewhat thanks to Cayman’s better power management, coming in at 37W. Under load we find our first doozy: the card’s TDP at default clocks is 375W (this is not a typo), and like the 5970, AMD has built it to take even more. Whereas the 5970 stayed within PCI-Express specifications at default clocks, the 6990 makes no attempt to do so, making it at 375W the most power-hungry card to date.
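For reference, the 375W figure lines up exactly with the card's power connectors. The PCIe specification allows 75W from the x16 slot, 75W from a 6-pin connector, and 150W from an 8-pin connector, and the 6990 carries two 8-pin plugs. A quick sketch of the budgets (our own arithmetic, not AMD's numbers):

```python
# PCIe power delivery limits per the spec, in watts.
SLOT_W = 75       # x16 slot
SIX_PIN_W = 75    # 6-pin PEG connector
EIGHT_PIN_W = 150 # 8-pin PEG connector

# 6990: slot + two 8-pin connectors -> over the 300W spec ceiling.
hd6990_budget = SLOT_W + 2 * EIGHT_PIN_W
# 5970: slot + one 6-pin + one 8-pin -> exactly the 300W spec ceiling.
hd5970_budget = SLOT_W + SIX_PIN_W + EIGHT_PIN_W

print(hd6990_budget, hd5970_budget)  # 375 300
```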

AMD will be launching the 6990 at $699. Officially this is $100 more than the 5970’s launch price, but the 5970 was virtually never available at that price until very late in its lifetime. $700 ends up much closer to both the 5970’s historical street price (around $700) and its roughly 2x premium over AMD’s top single-GPU part, the 5870. With a more stable supply of GPUs and stronger pressure from NVIDIA we’d expect prices to stick closer to MSRP this time around, though at the top of the market there’s little pressure to keep prices from rising. Meanwhile AMD has not provided any hard numbers on availability, but $700 cards are not high-volume products, so we’d expect availability to be a non-issue.

With the launch of the 6990 AMD’s high-end product stack is fully fleshed out. At the top will be the 6990, followed by the 6970, the 6950 2GB, and the 6950 1GB. The astute among you will notice that the average price of the 6970 is less than half that of the 6990, and as a result a 6970 CrossFire setup is cheaper than the 6990. At the lowest price we’ve seen for the 6970, we could pick up 2 of them for $640, which will put the 6990 in an interesting predicament of being a bit more expensive and a bit slower than the 6970 in CrossFire.

March 2011 Video Card MSRPs
NVIDIA                 Price        AMD
                       $700         Radeon HD 6990
                       $480
                       $350
                       $320-$340    Radeon HD 6970
                       $249-$269    Radeon HD 6950 2GB
GeForce GTX 560 Ti     $249
                       $230-$250    Radeon HD 6950 1GB
                       $219         Radeon HD 6870
                       $160-$170    Radeon HD 6850

 

130 Comments

  • nafhan - Tuesday, March 08, 2011 - link

    I generally buy cards in the $100-$200 range. Power usage has gone up a bit while performance has increased exponentially over the last 10 years.
  • LtGoonRush - Tuesday, March 08, 2011 - link

    I'm disappointed at the choices AMD made with the cooler. The noise levels are truly intolerable, it seems like it would have made more sense to go with a triple-slot card that would be more capable of handling the heat without painful levels of noise. It'll be interesting to see how the aftermarket cooler vendors like Arctic Cooling and Thermalright handle this.
  • Ryan Smith - Tuesday, March 08, 2011 - link

    There's actually a good reason for that. I don't believe I mentioned this in the article, but AMD is STRONGLY suggesting not to put a card next to the 6990. It moves so much air that another card blocking its airflow would run the significant risk of killing it.

    What does this have to do with triple-slot coolers? By leaving a space open, it's already taking up 3 spaces. If the cooler itself takes up 3 spaces, those 3 spaces + 1 open space is now 4 spaces. You'd be hard pressed to find a suitable ATX board and case that could house a pair of these cards in Crossfire if you needed 8 open spaces. Triple slot coolers are effectively the kryptonite for SLI/CF, which is why NVIDIA isn't in favor of them either (but that's a story for another time).
  • arkcom - Tuesday, March 08, 2011 - link

    2.5 slot cooler. That would guarantee at least half a slot is left for airspace.
  • Quidam67 - Tuesday, March 08, 2011 - link

    If it means a quieter card then that might have been a compromise worth making. Also, 2.5 slots would stop people from making the ill-advised choice of using the slot next to the card, thus possibly killing it!
  • strikeback03 - Tuesday, March 08, 2011 - link

    With the height of a triple-slot card, maybe they could mount the fan at an angle to prevent blocking it off.
  • kilkennycat - Tuesday, March 08, 2011 - link

    Triple-slot coolers... no need!!

    However, if one is even contemplating Crossfire or SLI then a triple-slot space between the PCIe X16 SOCKETS for a pair of high-power 2-slot-cooler graphics cards with "open-fan" cooling (like the 6990) is recommended to avoid one card being fried by lack of air. This socket-spacing allows a one-slot clear air-space for the "rear" card's intake fan to "breathe". (Obviously, one must not plug any other card into any motherboard socket present in this slot)

    In the case of a pair of 6990 (or a pair of nVidia's upcoming dual-GPU card), a minimum one-slot air-space between cards becomes MANDATORY, unless custom water or cryo cooling is installed.

    Very few current Crossfire/SLI-compatible motherboards have triple-slot (or more) spacing between the two PCIe X16 connectors while simultaneously having genuine X16 data-paths to both connectors. That socket spacing is becoming more common with high-end Sandy Bridge motherboards, but functionality may still be constrained by X8 PCIe data-paths at the primary pair of X16 connectors.

    To even attempt to satisfy the data demands of a pair of 6990s in CrossFire with a SINGLE physical CPU, you really do need an X58 motherboard and a Gulftown Core i7 990X processor, or maybe a heavily overclocked Core i7 970. For X58 motherboards with triple-spaced PCIe sockets properly suitable for Crossfire or SLI, you need to look at the ASRock X58 "Extreme" series of motherboards. These do indeed allow full X16 data-paths to the two primary "triple-spaced" PCIe X16 sockets.

    Many ATX motherboards have a third "so-called" PCIe X16 socket in the "slot 7" position. However, this slot is always incapable of a genuine X16 pairing with either of the other two "X16" sockets. Anyway, this "slot 7" location will not allow anything more than a two-slot-wide card when the motherboard is installed in a PC "tower" -- an open-fan graphics card will have no proper ventilation here, as it comes right up against either the power supply (if bottom-mounted) or the bottom plate of the case.
  • Spazweasel - Tuesday, March 08, 2011 - link

    Exactly. For people who are going to do quad-Crossfire with these, you pretty much have to add the cost of a liquid cooling system to the price of the cards, and it's going to have to be a pretty studly liquid cooler too. Of course, the kind of person who "needs" (funny, using that word!) two of these is also probably the kind of person who would do the work to implement a liquid cooling system, so that may be less of an issue than it otherwise might be.

    So, here's the question (more rhetorical than anything else). For a given ultra-high-end gaming goal, say, Crysis @ max settings, 60fps @ 3x 2560x1600 monitors (something that would require quad Crossfire 69xx or 3-way SLI 580), with a targeted max temperature and noise level... which is the cheaper solution by the time you take into account cooling, case, high-end motherboard, and the cards themselves? That's the cost comparison that needs to be made, not just the cost of the cards themselves.
  • tzhu07 - Tuesday, March 08, 2011 - link

    Before anyone thinks of buying this card stock, you should really go out and get a sense of what that kind of noise level is like. Unless you have a pair of high quality expensive noise-cancelling earbuds and you're playing games at a loud volume, you're going to constantly hear the fan.

    $700 isn't the real price. Add on some aftermarket cooling and that's how much you're going to spend.

    Don't wake the neighbors...
  • Spivonious - Tuesday, March 08, 2011 - link

    70dB is the maximum volume before risk of hearing loss, according to the EPA. http://www.epa.gov/history/topics/noise/01.htm

    Seriously, AMD, it's time to look at getting more performance per Watt.
