Meet the 6970 & 6950

Now that we’ve finally looked at what makes the 6900 series tick, let’s look at the cards themselves.

If you’re familiar with the 6800 series, then the 6900 series will look nearly identical. For our reference cards AMD is using the same style it used for the 6800 cards: a completely shrouded, squared-off design. Furthermore, unlike the 5800 series, AMD is utilizing the same cooler/PCB/layout for both the 6970 and 6950, meaning virtually everything we have to say about one card applies to the other as well. In this case we’ll be using the 6970 as our point of reference.


Top: 5870. Bottom: 6970

Starting with the length, the 6970 measures a hair over 10.5”, giving it the same length as the 5870. Buyers looking for a shorter, 5850-like card will have to look elsewhere for the moment, as the 6950 is the same 10.5”. Power is provided by a set of 6-pin + 8-pin PCIe power sockets at the top of the card, necessary as the 6970’s 250W TDP is in excess of the 225W limit of a 6-pin + 6-pin configuration. The 6950, on the other hand, uses 6-pin + 6-pin PCIe power sockets in the same location, afforded by its lower 200W TDP.
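
For those keeping track of where those limits come from, the arithmetic is straightforward: the PCIe slot itself is specified for 75W, each 6-pin connector adds 75W, and an 8-pin connector adds 150W. A minimal sketch of that budget math (illustrative values from the PCIe specification, not anything AMD publishes per card):

```python
# Rough PCIe power-budget arithmetic (spec values: 75W from the x16 slot,
# 75W per 6-pin connector, 150W per 8-pin connector). Illustrative only.
SLOT_W = 75
SIX_PIN_W = 75
EIGHT_PIN_W = 150

def board_power_limit(six_pins: int, eight_pins: int) -> int:
    """Maximum board power allowed by the slot plus the given connectors."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(board_power_limit(2, 0))  # 6+6-pin -> 225W, below the 6970's 250W TDP
print(board_power_limit(1, 1))  # 6+8-pin -> 300W, enough headroom for 250W
```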

Cracking open the 6970 we find the PCB with the Cayman GPU at the center in all its 389mm² glory. Around it are eight 2Gb Hynix GDDR5 chips rated for 6Gbps, 0.5Gbps higher than what the card actually runs at. As we’ve said before, the hardest part about using GDDR5 at high speeds is the complexity of building a good memory bus, and this continues to be the case here. AMD has made progress on getting GDDR5 speeds up to 5.5Gbps primarily through better PCB designs, but it looks like hitting 6Gbps and beyond is going to be impractical, at least for a 256-bit bus design. Ultimately GDDR5 was supposed to top out at 7Gbps, but with the troubles both AMD and NVIDIA have had, we don’t expect anyone will ever reach it.
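
To put those per-pin data rates in perspective, peak memory bandwidth is simply the data rate multiplied by the bus width. A quick sketch (the 5.5Gbps and 256-bit figures are the 6970’s published specs; the 6Gbps line shows what the chips’ rating would buy if it were actually reachable):

```python
# Peak GDDR5 bandwidth = per-pin data rate * bus width / 8 bits per byte.
def peak_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(peak_bandwidth_gbs(5.5, 256))  # ~176 GB/s as the 6970 ships
print(peak_bandwidth_gbs(6.0, 256))  # ~192 GB/s if the chips' 6Gbps rating were usable
```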

Moving on to the cooling apparatus, vapor chamber coolers are clearly in vogue this year. AMD already used a vapor chamber last year on the dual-GPU 5970, while this year both AMD and NVIDIA are using them on their high-end single-GPU products. Compared to a more traditional heatpipe cooler, a vapor chamber cooler is both more efficient and easier to build into a design, as there’s no need to worry about where to route the heatpipes. Meanwhile airflow is provided by a blower at the rear of the card; compared to the 5870 the blower on the 6970 is just a bit bigger, a fair consideration given that the 6970 is a hotter card. Interestingly, in spite of the higher TDP AMD has still been able to hold on to the half-height exhaust port at the front of the card.

As for I/O we’re looking at AMD’s new port layout as seen on the 6800 series: 2x DVI, 1x HDMI 1.4, and 2x mini-DP. Altogether the 6970 can drive up to 6 monitors through the use of the mini-DP ports and an MST hub. Compared to the 5800 series the DVI-type ports have a few more restrictions, however; along with the usual limitation of only being able to drive 2 DVI-type monitors at once, AMD has reduced the 2nd DVI port to a single-link port (although it maintains the dual-link pin configuration), so you won’t be able to drive 2 2560 or 3D monitors using the DVI ports alone.
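
The single-link restriction comes down to pixel clock: single-link DVI tops out at a 165MHz pixel clock, dual-link at roughly double that, and a 2560x1600 panel at 60Hz needs well over 165MHz. A rough sketch of that math, using approximate reduced-blanking timing totals purely for illustration:

```python
# Single-link DVI is limited to a 165MHz pixel clock; dual-link roughly doubles it.
# Totals below are approximate CVT reduced-blanking timings, for illustration only.
SINGLE_LINK_MHZ = 165
DUAL_LINK_MHZ = 2 * SINGLE_LINK_MHZ

def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: int = 60) -> float:
    """Required pixel clock in MHz for the given total (active + blanking) timings."""
    return h_total * v_total * refresh_hz / 1e6

print(pixel_clock_mhz(2080, 1235))  # 1920x1200 reduced blanking: ~154MHz, fits single link
print(pixel_clock_mhz(2720, 1646))  # 2560x1600 reduced blanking: ~269MHz, needs dual link
```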

Elsewhere the card features 2 CrossFire connectors at the top, allowing for tri-CF for the particularly rich and crazy. Next to the CF connectors you’ll find AMD’s not-so-secret switch, which controls the card’s switchable BIOSes. The card has 2 BIOSes, which can be selected with the flick of the switch. The primary purpose of this is to offer a backup BIOS in case of a failed BIOS flash: it’s possible to boot the card with the secondary BIOS and then switch back to the primary BIOS after the computer has started in order to reflash it. Normally AMD doesn’t strike us as very supportive of BIOS flashing, so this is an interesting change.


The BIOS Switch

Like the 5870, the back side is covered with a metal plate. While there aren’t any components on the back of the card to protect, it’s a nice touch that makes it easier to grab the card without worrying about catching a sharp solder point.

Finally, while the card’s overall dimensions are practically identical to the 5870, we noticed that the boxy design isn’t doing AMD any favors when it comes to CrossFire mode with 2 cards right next to each other. The 5870’s shroud actually jutted out just a bit at the center, keeping the ventilation hole for the blower from pressing right up against the back of another card. The 6970 does not have this luxury, meaning it’s possible to practically seal off the upper card’s blower intake depending on how you screw the cards down. As a result our CF temperatures run high, but not to a troublesome degree. We’d still encourage AMD to take a page from NVIDIA’s book and bring the shroud in a bit around the blower so that it has more room to breathe, particularly as their TDPs approach NVIDIA’s. In the meantime we’d definitely suggest spacing your cards apart if you have a motherboard and case that allow it.

Comments (168)

  • Remon - Wednesday, December 15, 2010 - link

    Seriously, are you using 10.10? It's not like 10.11 has been out for a while. Oh, wait...

    They've been out for almost a month now. I'm not expecting you to use 10.12, as those were released just 2 days ago, but there's no excuse for not using month-old drivers. Testing overclocked Nvidia cards against newly released cards, and now using older drivers. This site gets more biased with each release.
  • cyrusfox - Wednesday, December 15, 2010 - link

    I could be wrong, but 10.11 didn't work with the 6800 series, so I would imagine 10.11 wasn't meant for the 6900 either. If that is the case, it makes total sense why they used 10.10 (since it was the most up-to-date driver available when they reviewed).

    I am still using 10.10e, thinking about updating to 10.12, but why bother, things are working great at the moment. I'll probably wait for 11. or 11.2.
  • Remon - Wednesday, December 15, 2010 - link

    Nevermind, that's what you get when you read reviews early in the morning. The 10.10e was for the older AMD cards. Still, I can't understand the difference between this review and HardOCP's.
  • flyck - Wednesday, December 15, 2010 - link

    It doesn't. Anand has the same result for 25.. resolutions with max details, AA and FSAA.

    The presentation on Anand, however, is more focused on 16x..10.. resolutions (last graph). If you look at the first graph you'll notice the 6970/6950 perform like HardOCP found: the higher the quality, the smaller the gap becomes between the 6950 and 570 and between the 6970 and 580; the lower the quality, the more the 580 runs away and the 6970/6950 trail the 570.
  • Gonemad - Wednesday, December 15, 2010 - link

    Oookay, new card from the red competitor. Welcome aboard.

    But, all of this time, I had to ask: why is Crysis so punitive on graphics cards? I mean, it was released eons ago, and it still can't be run with everything cranked up on a single card if you want 60fps...

    Is it sloppy coding? Does the game *really* look better with all the eye candy? Or did they build in an "FPS bug" on purpose, some method of coding that was sure to torture any hardware that would be built in the 18 months after release?

    I will get slammed for this, but for instance, the water effects in Half Life 2 look great even on lower-spec cards once you turn all the eye candy on, and the FPS doesn't drop that much. The same goes for some subtle HDR effects.

    I guess I should see this game by myself and shut up about things I don't know. Yes, I enjoy some smooth gaming, but I wouldn't like to wait 2 years after release to run a game smoothly with everything cranked up.

    Another one is Dirt 2. I played it with all the eye candy turned up to the top, and my 5870 dropped to 50-ish FPS (as per the benchmarks); it could be noticed eventually. I turned one or two things off, checked that they weren't missed after another run, and the in-game FPS meter jumped to 70. Yay.
  • BrightCandle - Wednesday, December 15, 2010 - link

    Crysis really does have some fabulous graphics. The amount of foliage in the forests is very high. Crysis kills cards because it really does push current hardware.

    I've got Dirt 2 and it's not close in the level of detail. It's a decent-looking game at times but it's not a scratch on Crysis for the amount of stuff on screen. Half Life 2 is also not bad looking, but it still doesn't have the same amount of detail. The water might look good, but it's not as good as a PC game can look.

    You should buy Crysis, it's £9.99 on Steam. It's not a good game IMO, but it sure is pretty.
  • fausto412 - Wednesday, December 15, 2010 - link

    yes...it's not much of a fun game but damn it is pretty
  • AnnihilatorX - Wednesday, December 15, 2010 - link

    Well, the original Crysis did push things too far and could have used some optimization. Crysis Warhead is much better optimized while giving pretty much identical visuals.
  • fausto412 - Wednesday, December 15, 2010 - link

    "I guess I should see this game by myself and shut up about things I don't know. Yes, I enjoy some smooth gaming, but I wouldn't like to wait 2 years after release to run a game smoothly with everything cranked up."

    That's probably a good idea. Crysis was made with future hardware in mind. It's like a freaking tech demo. Ahead of its time and beaaaaaautiful. Check it out on max settings... then come back and tell us what you think.
  • TimoKyyro - Wednesday, December 15, 2010 - link

    Thank you for the SmallLuxGPU test. That really made me decide to get this card. I make 3D animations with Blender in Ubuntu, so the only thing holding me back is the driver support. Do these cards work in Ubuntu? Is it possible for you to test whether the Linux drivers work at this time?
