Final Thoughts

Often it’s not until the last moment that we have all the information in hand to completely analyze a new video card, and the Radeon HD 6970 and Radeon HD 6950 were no different. With AMD not releasing pricing information to the press until Monday afternoon, we had already finished our performance benchmarks before we even knew the prices, so much time was spent speculating and agonizing over what route AMD would take. So let’s jump straight into our recommendations.

Our concern was that AMD would shoot themselves in the foot by pricing the Radeon HD 6970 in particular too high. If we take a straight average across 1920x1200 and 2560x1600, its performance is more or less equal to the GeForce GTX 570’s. In practice this means that NVIDIA wins a third of our games, AMD wins a third, and the two effectively tie on the rest, so the position of the 6970 relative to the GTX 570 depends heavily on just which games in our benchmark suite you favor. All we can say for sure is that on average the two cards are comparable.

So with that in mind, a $370 launch price is neither aggressive nor overpriced. Launching at $20 over the GTX 570 isn’t going to start a price war, but it’s also not so expensive as to rule the card out. Of the two, the 6970 takes the edge on power efficiency, but it’s interesting to see just how much NVIDIA’s and AMD’s power consumption and performance under gaming have converged; it used to be much more lopsided in AMD’s favor.

Meanwhile the Radeon HD 6950 occupies an interesting spot. Above it sit the GTX 570 and 6970; below it are the soon-to-be-discontinued GTX 470 and Radeon HD 5870. Those cards were a bit of a spoiler for the GTX 570, and this is once more the case for the 6950: the 6950 is on average 7-10% faster than the 5870 for around 20% more money. I am becoming increasingly convinced that more than 1GB of VRAM is necessary for any new card over $200, but we’re not quite there yet. When the 5870 is done and gone the 6950 will be a reasonable successor, but for the time being the 5870 at $250 is a steal of a deal if you don’t need the extra performance or new features like DisplayPort 1.2. Conversely, the 6950 is itself a bit of a spoiler: the 6970 is only 10-15% faster for $70 more, so if you had to have a 6900 series card, the 6950 is certainly the better deal. Whether you go with the 5870, the 6950, or the 6970, just keep in mind that the 6900 series is in a much better position for future games thanks to AMD’s new architecture.
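To put those percentages in perspective, here is a rough performance-per-dollar sketch using the prices quoted above ($250 for the 5870, $370 for the 6970, and $70 less for the 6950) and the midpoints of the article’s performance deltas; the exact normalized numbers are our own back-of-the-envelope assumptions, not measured figures.

```python
# Rough price/performance sketch. Performance is normalized to the
# Radeon HD 5870 (= 1.00). Deltas are the midpoints of the article's
# averages: 6950 ~7-10% faster than the 5870, 6970 ~10-15% faster
# than the 6950. Prices are the launch/street prices quoted above.
cards = {
    "Radeon HD 5870": {"price": 250, "perf": 1.00},
    "Radeon HD 6950": {"price": 300, "perf": 1.085},          # midpoint of +7-10%
    "Radeon HD 6970": {"price": 370, "perf": 1.085 * 1.125},  # +10-15% over 6950
}

for name, c in cards.items():
    perf_per_dollar = c["perf"] / c["price"]
    print(f"{name}: {100 * perf_per_dollar:.3f} perf units per $100")
```

Run this way, the 5870 comes out ahead on raw performance per dollar, which is why it remains the value pick for buyers who don’t need the 6900 series’ new features.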

And that brings us to the final matter for today: that new architecture. Compared to the launch of Cypress in 2009, the feature set isn’t radically different the way it was when AMD first added DirectX 11 support, but Cayman is radically different in its own way. After being carried by their VLIW5 architecture for nearly four years, AMD is set to hand off their future to their new VLIW4 architecture. It won’t turn the world upside down for AMD or its customers, but it’s a reasonable step forward for the company, reducing their reliance on ILP in favor of narrower, TLP-heavy workloads. For gaming this specifically means their hardware should be a better match for future DX10/DX11 games, and the second graphics engine should give them enough tessellation and rasterization power for the time being.
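The intuition behind the VLIW5-to-VLIW4 move can be illustrated with a deliberately simplified model (this is not AMD’s actual compiler or scheduler): when a shader clause offers only a few independent operations, a 5-wide issue bundle leaves more slots empty than a 4-wide one, so the narrower design wastes less hardware on code with limited ILP.

```python
# Toy model of VLIW slot utilization (illustrative only, not AMD's
# real scheduler): pack `ops` independent operations into bundles of
# `width` slots and measure the fraction of slots doing useful work.
import math

def utilization(ops: int, width: int) -> float:
    """Fraction of issue slots filled when `ops` independent ops
    are packed into `width`-wide VLIW bundles."""
    bundles = math.ceil(ops / width)
    return ops / (bundles * width)

# Example clauses with different amounts of instruction-level parallelism:
for ilp in (3, 4, 7):
    print(f"{ilp} independent ops -> VLIW5: {utilization(ilp, 5):.0%}, "
          f"VLIW4: {utilization(ilp, 4):.0%}")
```

In this toy model a clause of 3 independent ops fills only 60% of a VLIW5 bundle but 75% of a VLIW4 bundle, which mirrors the article’s point: giving up peak width for better average occupancy on TLP-heavy workloads.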

Longer term, we will have to see how AMD’s compute gamble plays out. Though we’ve largely framed Cayman in terms of gaming, to AMD Cayman is first and foremost a compute GPU, in a manner very similar to another company whose compute GPU is also the fastest gaming GPU on the market. Teething issues aside, this worked out rather well for NVIDIA, but will lightning strike twice for AMD? The first Cayman-based video cards are launching today, but the Cayman story is just getting started.


168 Comments


  • Remon - Wednesday, December 15, 2010 - link

    Seriously, are you using 10.10? It's not like the 10.11 drivers have been out for a while. Oh, wait...

    They've been out for almost a month now. I'm not expecting you to use the 10.12, as those were released just 2 days ago, but there's no excuse for not using month-old drivers. Testing overclocked Nvidia cards against newly released cards, and now using older drivers. This site gets more biased with each release.
  • cyrusfox - Wednesday, December 15, 2010 - link

    I could be wrong, but 10.11 didn't work with the 6800 series, so I would imagine it wasn't meant for the 6900 either. If that's the case, it makes total sense why they used 10.10 (because it was the most up-to-date driver available when they reviewed).

    I am still using 10.10e, and thinking about updating to 10.12, but why bother? Things are working great at the moment. I'll probably wait for 11. or 11.2.
  • Remon - Wednesday, December 15, 2010 - link

    Nevermind, that's what you get when you read reviews early in the morning. The 10.10e was for the older AMD cards. Still, I can't understand the difference between this review and HardOCP's.
  • flyck - Wednesday, December 15, 2010 - link

    It doesn't. Anand has the same results for 25.. resolutions with max details, AA and FSAA.

    The presentation on Anand, however, is more focused on 16x..10.. resolutions (last graph). If you look at the first graph you'll notice the 6970/6950 perform like HardOCP found: the higher the quality, the smaller the gap between the 6950 and 570 and between the 6970 and 580; the lower the quality, the more the 580 runs away and the 6970/6950 trail the 570.
  • Gonemad - Wednesday, December 15, 2010 - link

    Oookay, new card from the red competitor. Welcome aboard.

    But all this time, I had to ask: why is Crysis so punitive on graphics cards? I mean, it was released eons ago, and it still can't be run with everything cranked up on a single card if you want 60fps...

    Is it sloppy coding? Does the game *really* look better with all the eye candy? Or did they build in an "FPS bug" on purpose, some method of coding that was sure to torture any hardware built in the 18 months after release?

    I will get slammed for this, but for instance, the water effects in Half-Life 2 look great even on lower-spec cards once you turn all the eye candy on, and the FPS doesn't drop that much. The same goes for some subtle HDR effects.

    I guess I should see this game by myself and shut up about things I don't know. Yes, I enjoy some smooth gaming, but I wouldn't like to wait 2 years after release to run a game smoothly with everything cranked up.

    Another one is Dirt 2. I played it with all the eye candy turned up, and my 5870 dropped to 50-ish FPS (as per the benchmarks); it was noticeable eventually. I turned one or two things off, checked after another run that I didn't miss them, and the in-game FPS meter jumped to 70. Yay.
  • BrightCandle - Wednesday, December 15, 2010 - link

    Crysis really does have some fabulous graphics. The amount of foliage in the forests is very high. Crysis kills cards because it really does push current hardware.

    I've got Dirt 2 and it's not close in level of detail. It's a decent-looking game at times, but it's not a patch on Crysis for the amount of stuff on screen. Half-Life 2 also isn't bad looking, but it still doesn't have the same amount of detail. The water might look good, but it's not as good as a PC game can look.

    You should buy Crysis; it's £9.99 on Steam. It's not a good game IMO, but it sure is pretty.
  • fausto412 - Wednesday, December 15, 2010 - link

    yes...it's not much of a fun game but damn it is pretty
  • AnnihilatorX - Wednesday, December 15, 2010 - link

    Well, the original Crysis did push things too far and could have used more optimization. Crysis Warhead is much better optimized while giving practically identical visuals.
  • fausto412 - Wednesday, December 15, 2010 - link

    "I guess I should see this game by myself and shut up about things I don't know. Yes, I enjoy some smooth gaming, but I wouldn't like to wait 2 years after release to run a game smoothly with everything cranked up."

    That's probably a good idea. Crysis was made with future hardware in mind. It's like a freaking tech demo: ahead of its time and beaaaaaautiful. Check it out on max settings, then come back and tell us what you think.
  • TimoKyyro - Wednesday, December 15, 2010 - link

    Thank you for the SmallLuxGPU test. That really made me decide to get this card. I make 3D animations with Blender in Ubuntu, so the only thing holding me back is the driver support. Do these cards work in Ubuntu? Is it possible for you to test whether the Linux drivers work at this time?
