Final Words

With 3 major launches in under 3 months it may seem like I've written the same thing time and time again, and that wouldn't be an incorrect observation. By being the first to deploy 28nm GPUs, AMD has enjoyed a multi-month lead over NVIDIA that has allowed them to set their own pace, and there's little NVIDIA can do but sit back and watch. Consequently we're seeing AMD roll out a well-orchestrated launch plan unhindered, launching each new Southern Islands card exactly where they intended to from the beginning.

At each launch AMD has undercut NVIDIA at critical price points, pushing NVIDIA out of the picture, and the launch of the Radeon HD 7800 series is no different. AMD's decision to launch the 7870 and 7850 at roughly $25 to $50 over the GTX 570 and GTX 560 Ti respectively means that NVIDIA's cards still have a niche between AMD's price points for the time being, but this is effectively a temporary situation as NVIDIA draws down inventory ahead of the eventual Kepler launch.

Starting with the Radeon HD 7870 GHz Edition, AMD is effectively in the clear for the time being. At roughly 9% faster than the GTX 570 there's little reason to get the GTX 570 even with the 7870's price premium; the 7870 is that much faster, cooler, and quieter. With the launch of Pitcairn and the 7870 in particular, GF110 has effectively been removed from competition after a run of nearly a year and a half.

As for the Radeon HD 7850, things are not so clearly in AMD's favor. From a power perspective it's by far the fastest 150W card you can buy, and that alone will earn AMD some major OEM wins along with some fans in the SFF PC space. From a price perspective it's certainly the best $250 card you can buy, but that's the catch: it's a $250 card. With GTX 560 Ti prices starting to drop below $200 after rebate, the 7850 is nearly $50 more expensive than the GTX 560 Ti, while its performance is only ahead by about 9% on average; in the process it loses to the GTX 560 Ti in a couple of games, most notably Battlefield 3, where it trails by about 8%. AMD has a power consumption lead to go along with that performance lead, but without retail cards to test it's not clear whether that translates into any kind of noise advantage over the GTX 560 Ti. In the long run the 7850 is going to be the better buy, particularly because of its additional RAM in the face of increasingly VRAM-hungry games, but $199 for a GTX 560 Ti is going to be hard to pass up while it lasts.

Of course, being in the driver's seat when it comes to setting video card prices, AMD has continued to stick to conservative pricing, to both their benefit and detriment. The 7800 series isn't really any cheaper than the 6900 series it replaces; in fact it's probably a bit more expensive once you factor in the rebates that have been running on the 6900 series since last summer. But these prices stop the bleeding from what has been an aggressive price war between the two companies over the last 3 years, which is going to be of great importance to AMD in the long run.

Nevertheless we’re largely in the same situation now as where we were with the 7700 series: AMD has only moved a small distance along the price/performance curve with the 7800 series, and they’re in no particular hurry to change that. But if nothing else, on the product execution side of things AMD has done a much better job, getting their old cards out of the market well ahead of time in order to keep from having to compete with themselves. As a result your choices right now at $200+ are the 7800 and 7900 series, or last-generation Fermi cards. Otherwise we’re in a holding pattern until AMD brings prices down, which considering Pitcairn is the replacement for the Barts-based 6800, could potentially be quite a reduction in the long run.

Wrapping things up, at this point in time AMD has taken firm control of the $200+ video card market. The only real question is this: for how long? AMD enjoyed a nearly 6 month lead over NVIDIA when rolling out the first generation of 40nm DX11 cards, but will they enjoy a similarly long lead with the first generation of 28nm cards? Only time will tell.

173 Comments

  • mak360 - Monday, March 05, 2012

    Enjoy, now go and buy.
  • ImSpartacus - Monday, March 05, 2012

    Yeah, I'm trying to figure out if a 7850 could go in an Alienware X51. It looks like it uses a 6 pin power connector and puts out 150W of heat.

    While we would lose Optimus, would it work?
  • taltamir - Monday, March 05, 2012

    Optimus is laptops only. You do not have Optimus with your desktop.
  • ImSpartacus - Monday, March 05, 2012

    The X51 has desktop Optimus.

    "The icing on the graphics cake is that the X51 is the first instance of desktop Optimus we've seen. That's right: you can actually plug your monitor into the IGP's HDMI port and the tower will power down the GPU when it's not in use. This implementation functions just like the notebook version does, and it's a welcome addition."

    http://www.anandtech.com/show/5543/alienware-x51-t...

    In reality, if I owned an X51, I would wait so I could shove the biggest 150W Kepler GPU in there for some real gaming.

    But I'm sure the X51 will be updated for Kepler and Ivy Bridge, so now wouldn't be the best time to get an X51.

    Waiting games are lame...
  • scook9 - Monday, March 05, 2012

    Wrong. Read a review... The bigger issue will be the orientation of the PCIe power connector, I expect. I have a tightly spaced HTPC that currently uses a GTX 570 HD from EVGA because it was the best card I could fit in the Antec Fusion enclosure. If the PCIe power plugs were facing out the side of the card and not the back I would not have been able to use it. I expect the same consideration will apply to the even smaller X51.
  • kungfujedis - Monday, March 05, 2012

    He does. The X51 is a desktop with Optimus.

    http://www.theverge.com/2012/2/3/2768359/alienware...
  • Samus - Monday, March 05, 2012

    EA really screwed AMD over with Battlefield 3. There's basically no reason to consider a Radeon card if you plan on heavily playing BF3, especially since most other games like Skyrim, Star Wars, Rage, etc. all run excellently on any $200+ card, with anything $300+ being simply overkill.

    The obvious best card for Battlefield 3 is a GeForce GTX 560 Ti 448 Cores for $250-$280, basically identical in performance to the GTX 570 in BF3. Even those on a budget would be better served with a low-end GTX 560 series card unless you run resolutions above 1920x1200.

    If I were AMD, I'd concentrate on increasing Battlefield 3 performance with driver tweaks, because it's obvious their architecture is superior to nVidia's, but these 'exclusive' titles are tainted.
  • kn00tcn - Monday, March 05, 2012

    screwed how? only the 7850 is slightly lagging behind, & historically BC2 was consistently a little faster on nv

    also BF3 has a large consistent boost since feb14 drivers (there was another boost sometime in december, benchmark3d should have the info for both)
  • chizow - Tuesday, March 06, 2012

    @ Samus

    BF3 isn't an Nvidia "exclusive", they made sure to remain vendor agnostic and participate in both IHV's vendor programs. No pointing the finger and crying foul on this game, it just runs better on Nvidia hardware but I do agree it should be running better than it does on this new gen of AMD hardware.

    http://www.amd4u.com/freebattlefield3/
    http://sites.amd.com/us/game/games/Pages/battlefie...
  • CeriseCogburn - Monday, March 26, 2012

    In the reviews here SHOGUN 2 total war is said to be the very hardest on hardware, and Nvidia wins that - all the way to the top.
    --
    So on the most difficult game, Nvidia wins.
    Certainly other factors are at play on these amd favored games like C1 and M2033 and other amd optimized games.
    --
    Once again, on the MOST DIFFICULT to render Nvidia has won.
