Final Words

With 3 major launches in under 3 months it seems like I’ve written the same thing time and time again, and that wouldn’t be an incorrect observation. By being the first to deploy 28nm GPUs, AMD has been enjoying a multi-month lead over NVIDIA that has allowed them to set their own pace, and there’s little NVIDIA can do but sit back and watch. Consequently we’re seeing AMD roll out a well-orchestrated launch plan unhindered, launching each new Southern Islands card at exactly the price they intended from the beginning.

At each launch AMD has undercut NVIDIA at critical points, allowing them to push NVIDIA out of the picture, and the launch of the Radeon HD 7800 series is no different. AMD’s decision to launch the 7870 and 7850 at roughly $25 to $50 over the GTX 570 and GTX 560 Ti respectively means that NVIDIA’s cards still have a niche between AMD’s price points for the time being, but this is effectively a temporary situation as NVIDIA starts drawing down inventory for the eventual Kepler launch.

Starting with the Radeon HD 7870 GHz Edition, AMD is effectively in the clear for the time being. At roughly 9% faster than the GTX 570 there’s little reason to get the GTX 570 even with the 7870’s price premium; it’s that much faster, cooler, and quieter. With the launch of Pitcairn and the 7870 in particular, GF110 has effectively been removed from competition after a nearly year and a half run.

As for the Radeon HD 7850, things are not so clearly in AMD’s favor. From a power perspective it's by far the fastest 150W card you can buy, and that alone will earn AMD some major OEM wins along with some fans in the SFF PC space. Otherwise from a price perspective it’s certainly the best $250 card you can buy, but then that’s the catch: it’s a $250 card. With GTX 560 Ti prices starting to drop below $200 after rebate, the 7850 is nearly $50 more expensive than the GTX 560 Ti. At the same time its performance is only ahead of the GTX 560 Ti by about 9% on average, and it loses to the GTX 560 Ti in a couple of games, most importantly Battlefield 3 by about 8%. AMD has a power consumption lead to go along with that performance lead, but without retail cards to test it’s not clear whether that translates into any kind of noise improvements over the GTX 560 Ti. In the long run the 7850 is going to be the better buy – in particular because of its additional RAM in the face of increasingly VRAM-hungry games – but $199 for a GTX 560 Ti is going to be hard to pass up while it lasts.

Of course by being in the driver’s seat overall when it comes to setting video card prices AMD has continued to stick to their conservative pricing, both to their benefit and detriment. The 7800 series isn’t really any cheaper than the 6900 series it replaces; in fact it’s probably a bit more expensive after you factor in the rebates that have been running on the 6900 series since last summer. But these prices stop the bleeding from what has been an aggressive price war between the two companies over the last 3 years, which is going to be of great importance to AMD in the long run.

Nevertheless we’re largely in the same situation now as we were with the 7700 series: AMD has only moved a small distance along the price/performance curve with the 7800 series, and they’re in no particular hurry to change that. But if nothing else, on the product execution side of things AMD has done a much better job, getting their old cards out of the market well ahead of time in order to keep from having to compete with themselves. As a result your choices right now at $200+ are the 7800 and 7900 series, or last-generation Fermi cards. Otherwise we’re in a holding pattern until AMD brings prices down, which, considering Pitcairn is the replacement for the Barts-based 6800 series, could potentially be quite a reduction in the long run.

Wrapping things up, at this point in time AMD has taken firm control of the $200+ video card market. The only real question is this: for how long? AMD enjoyed a nearly 6 month lead over NVIDIA when rolling out the first generation of 40nm DX11 cards, but will they enjoy a similarly long lead with the first generation of 28nm cards? Only time will tell.


  • Kaboose - Monday, March 05, 2012 - link

    The 7870 beats the GTX 570 and is about even with the GTX 580, uses 150 watts less power at load, is quieter, is cooler, and draws over 23 watts less at idle. How is this a disappointment? The only disappointment I see is the price, which is the result of no competition from Nvidia.
  • Kiste - Monday, March 05, 2012 - link

    A new generation of GPUs used to give us a whole hell of a lot more performance at any given price point. The current AMD stuff does not and that is a disappointment.

    Case in point: you even have to talk these things up by basically saying "oh, well, at least they draw less power".
  • Kaboose - Monday, March 05, 2012 - link

    Dropping power consumption by over 50% is something of a gimmick? Dropping load temps by 14C compared to the GTX 570 is not significant? 14C is a fairly large difference, which leaves more headroom for overclocking as well as a cooler system overall. When Nvidia releases Kepler and we have both companies on 28nm then we can (hopefully) see some competition on price. In my opinion the 7870 at $325 would be a great card right now. Once Kepler is out, $285-300 I think would be nice. I agree it is overpriced right now, however.

    If Nvidia releases Kepler and gives us a LOT more performance over last generation then I will concede that the 7xxx series is a failure. However from the way AMD is behaving it doesn't appear Kepler is going to do much in terms of raw performance either.
  • Kiste - Monday, March 05, 2012 - link

    While reducing power consumption might not be a gimmick, it is the result of the new process node and thus in itself not particularly impressive, especially when you more or less keep the performance the same as with the previous generation.

    I'm still not impressed, sorry. Price/performance plain and simply sucks ass with these cards, barely beating the stuff that's on the market right now in that regard.

    And even with the high-end SI cards there's barely much of a performance boost compared to what's already been on the market for months.

    Sure, less power draw is nice. I won't complain about it, but if a brand new generation of GPUs comes out and I am not even one little bit compelled to upgrade from my aging, heavily overclocked GTX570, then something is clearly wrong here.
  • Exodite - Monday, March 05, 2012 - link

    We'll just have to wait and see what Nvidia provides when they finally decide to put competition on the market, won't we?

    I'd happily agree to finding the 7900-series not as high-performing as I'd like, and the 7700-series too expensive.

    From the reviews I've read so far the 7800-series, the 7850 especially, is pretty much the perfect card ATM.

    Low power, low noise, cool, 2GB VRAM and runs between a 560 Ti and 570 in performance.

    It's definitely the card I'd recommend to anyone at this point, especially given the fact that we'll see better coolers than AMD's atrocities once we get release versions.
  • Iketh - Monday, March 05, 2012 - link

    You must live in a cold climate. You're happy with a heavily overclocked 570?? I live in FL, and that card increases my power bill $30-$70 each month over my 6870 during 3 of the 4 seasons, and I'm talking from experience. Do you have any idea how hard an A/C has to work in a small 2 bedroom house to counter the blast of heat from an overclocked gaming rig??

    If you live in a hot climate, test it for yourself.

    You don't compare just the power draw of the cards themselves....
  • Kiste - Monday, March 05, 2012 - link

    I'm not quite sure if you're actually expecting a serious answer to that kind of hyperbolic drivel.
  • Jamahl - Monday, March 05, 2012 - link

    You really don't get it, do you? These cards REALLY DO heat up rooms. Where do you think the heat goes? Ever heard of the law of conservation of energy?
  • londiste - Monday, March 05, 2012 - link

    oh damn, i need to get two of those, maybe they'll reduce my heating bill in winter :)
  • Kiste - Monday, March 05, 2012 - link

    Spelling "really do" in capital letters doesn't make it any less ridiculous a statement.

    My whole PC (GPU, OCed CPU, 4 HDDs) draws slightly more than 300W under typical gaming loads. You can't "heat up a room" with that, much less with just the GPU.
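For what it's worth, the cost claims being argued in this thread can be sanity-checked with some back-of-envelope arithmetic. Every input below (the extra power draw, hours played per day, electricity rate, and A/C efficiency) is an assumed figure for illustration, not a number from the review or the comments:

```python
# Rough monthly cost of a card drawing extra power under load, plus the
# air-conditioning cost of pumping that heat back out in a hot climate.
extra_watts = 150      # assumed extra load draw vs. a lower-power card
hours_per_day = 4      # assumed daily gaming time
rate_per_kwh = 0.12    # assumed electricity rate in $/kWh

# Extra energy consumed per 30-day month, in kWh.
kwh_per_month = extra_watts / 1000 * hours_per_day * 30
direct_cost = kwh_per_month * rate_per_kwh

# All of that energy ends up as heat in the room. With an A/C coefficient
# of performance (COP) of 3, removing 1 kWh of heat costs about 1/3 kWh
# of additional electricity.
ac_cost = kwh_per_month / 3 * rate_per_kwh

print(f"Extra energy: {kwh_per_month:.1f} kWh/month")
print(f"Direct cost:  ${direct_cost:.2f}/month")
print(f"A/C overhead: ${ac_cost:.2f}/month")
```

Under these particular assumptions the total comes to a few dollars a month, so larger bill swings would require substantially higher draw, longer hours, or pricier electricity; the physics point stands either way, since every watt the card draws does end up as heat in the room.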
