Final Words

Wrapping things up, I once had someone comment to me that they can gauge my opinion of a product based solely on the first paragraph of the final page. If I say “there’s no such thing as a bad card, only bad prices” then it’s likely not a favorable review. That statement is once more being validated today, if only in a meta context.

To be clear, we’ve been waiting for some time to see GCN filter down to lower priced cards, and even longer to see PowerTune in particular make it down here. The fact that we now have reliable power throttling and solid compute performance is not lost on us. It’s a welcome advancement.

However, our expectation with a new manufacturing process – and perhaps we’re being greedy here – is that cards will become cheaper and power consumption will come down. AMD has achieved the second item in spades, and as a result both the Radeon HD 7750 and Radeon HD 7770 are well ahead of any competing 75W and 100W cards respectively. The 7750 in particular is a standout: it generally offers 5700 series performance on a sub-75W card, and even at $109 it clearly offers a great deal of value as an HTPC video card. All of this will be an even more welcome change when Cape Verde filters down to laptops in the coming months.

The problem for AMD today isn’t the power/performance curve, it’s the price/performance curve. 16 months ago AMD launched the Radeon HD 6850 at $179 amidst fierce competition from NVIDIA. Ignoring the current price of the 6850 for the moment, on average the 7770 delivers 90% of the 6850’s gaming performance for 90% of the 6850’s launch price. In other words, in 16 months AMD has moved nowhere along the price/performance curve – if you go by launch prices, you’re getting the same amount of performance per dollar today as you did in October of 2010. In reality the 6850 is now much cheaper than that, with a number of cards selling for $159 before a rebate, while several more 6870s sell for $159 after rebate. The 7770 is so far off the price/performance curve that you have to believe either this is a pricing error or AMD is planning on quickly halting 6800 series production.
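To make the arithmetic behind that claim concrete, here is a minimal back-of-the-envelope sketch. The only quoted figures are the 6850’s $179 launch price, its roughly $159 street price today, and the “90% of the performance for 90% of the launch price” relationship; the 7770 price used below is inferred from that ratio rather than quoted directly.

```python
# Rough check of the price/performance argument above.
# Figures from the text: 6850 launched at $179 (Oct 2010), now sells for ~$159;
# the 7770 delivers ~90% of the 6850's performance for ~90% of its launch price.
# The 7770 price below is an inference (0.9 * $179), not a quoted number.

hd6850_launch_price = 179.0
hd6850_current_price = 159.0
hd7770_price = 0.9 * hd6850_launch_price   # ~$161, inferred from the article's ratio
hd7770_relative_perf = 0.9                 # 7770 performance, normalized to the 6850 = 1.0

# Performance per dollar for each case
perf_per_dollar_6850_launch = 1.0 / hd6850_launch_price
perf_per_dollar_6850_today = 1.0 / hd6850_current_price
perf_per_dollar_7770 = hd7770_relative_perf / hd7770_price

print(f"7770 vs. 6850 at launch prices: {perf_per_dollar_7770 / perf_per_dollar_6850_launch:.2f}x the perf/$")
print(f"7770 vs. 6850 at today's $159:  {perf_per_dollar_7770 / perf_per_dollar_6850_today:.2f}x the perf/$")
# Prints ~1.00x against the 6850's launch pricing and ~0.89x against its current
# pricing: no perf/$ progress versus October 2010, and a worse value than the
# 6850 as it actually sells today.
```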

Now, to be fair, there’s more to consider than just performance in existing games. The 7770 supports DX11.1, VCE, PowerTune, Fast HDMI, and other features the 6800 series doesn’t have, and it does all of this while consuming around 25W less than the 6850. But that’s just not enough. DX11.1 is a point update that’s still the better part of a year away and will only offer a handful of new features, VCE is AWOL and cannot be evaluated, and Fast HDMI will be a niche feature for use with extremely expensive TVs for some time to come. This is not like the 4000/5000 series gap – today and tomorrow the 7000 series will only offer marginal feature benefits. The best argument for the 7770 is the power difference, but considering that both the 6850 and 7770 require external power anyhow, that 25W difference is unlikely to matter.

The 7700 series is a fine lineup of cards, but AMD has finally shot itself in the foot with its conservative pricing. The 7750 can ride on the sub-75W niche for now, but the only way the 7770 will make any sense is if it comes down in price. Until then AMD’s worst competition for the 7700 series is not NVIDIA, it’s their 6850.

With that said, the 7700 series clearly has potential. XFX’s R7770 Black Edition S Double Dissipation does a great job demonstrating this with its virtually silent operation, while the card’s factory overclock largely closes the performance gap with the 6850. With its combination of performance and power consumption the 7700 series will be AMD’s midrange workhorse for 2012, of that there is no question. Now it’s simply up to AMD to make it so. After all there’s no such thing as a bad card, only bad prices.

Comments

  • Hellbinder - Thursday, February 16, 2012 - link

    sorry for the typo, the 6000 list should have started with 6900
  • Belard - Thursday, February 16, 2012 - link

    This modern new 7750 at $110 performs about the same as a 4-year-old 4850... which was selling for $100 about 3 years ago. The 4800 series was more expensive to produce and drank more power.

    The names don't mean much anymore; they were stupid to change the stack names, which were fine from the 3000~5000 series.
  • delirious_nomad - Thursday, February 16, 2012 - link

    there are reviews out there where 7770s X-Fired smoke a single 580, and for $300-some odd dollars...

    I have been out of PC gaming for a long time and these are going to be my cards of choice.

    reasons why... I don't care what the Jones do...

    I play at 1080p on an adequate LCD TV... and I don't need graphics maxed to the gills...

    I have older games Half Life, Jedi Knight, Knights of the Old Republic, Diablo, Morrowind, etc etc that I still want to play and the power down features and low power usage are great boons for me.

    from the X-Fire reviews I've seen so far they scale at about 2x, so just double the numbers and they smoke a single 580 while using less power and running nice and quiet...

    also it gives me a year or so before I build my second system, and who knows what will be out then. Then this gets handed down to my son and off we go, and it should be plenty fast enough for Minecraft

    the only card that comes really close for me is the 560 Ti 448 Core... and one of them is more expensive and doesn't beat a 580...

    here is link to techpowerups' X-Fire review... http://www.techpowerup.com/reviews/HIS/HD_7750_777...
  • KineticHummus - Thursday, February 16, 2012 - link

    "offers performance close to NVIDIA's GeForce GTX 570."

    That is straight from the techpowerup link you gave, on the conclusion page. Close to a GTX 570 isn't smoking the 580, which is what you stated CF 7770s will do...
  • CeriseCogburn - Wednesday, March 21, 2012 - link

    LOL - man ... thanks.
    Anyway there's a triple fan GTX 580 on egg for $359.
    http://www.newegg.com/Product/Product.aspx?Item=N8...
    ---
    I said it to you so I won't get attacked, but maybe the gentleman would like to reconsider in light of our helpful posts.
  • papapapapapapapababy - Thursday, February 16, 2012 - link

    this is all nice but sorry, I got burned waaaaaay too many times by AMD BS to even care! Having to wait for months for proper support, faulty drivers, and mind-bogglingly piss-poor performance per dollar in the latest games = never again going to buy or recommend any sort of AMD graphics solution. I'm going to wait for the next-gen consoles to launch, and then I'm going to get the absolutely cheapest and most efficient NVIDIA solution that offers me twice the performance of whatever M$ or Sony chooses to put inside their lil crappy casual boxes. Just like I always do, but this time AMD is out of the equation for good! See ya.
  • BoFox - Friday, February 17, 2012 - link

    The only source of this is a slideshow from AMD regarding the launch of Barts GPUs.

    And then AMD launched Bulldozer with a slideshow saying that it has 2B trannies. A few months after launch, AMD admitted that it was an error, saying that it has only 1.2B trannies.

    I've done an extensive performance analysis and concluded that all Barts-based GPUs (6870, 6850, and 6790) are VLIW4-based, just like their Cayman cousins.

    GCN appears to be around 10% more efficient than VLIW4 for games overall, but it's very hard to say exactly how much. If a 78xx card that comes out next month has very similar specs to a VLIW4-based card (heck, or a VLIW5-based one), it'd be much easier to say. Still researching this...
  • CeriseCogburn - Wednesday, March 21, 2012 - link

    Man it's just amazing...
    " And then AMD launched Bulldozer with a slideshow saying that it has 2B trannies. A few months after launch, AMD admitted that it was an error, saying that it has only 1.2B trannies. "
    I see that now...
    " Update: AMD originally told us Bulldozer was a 2B transistor chip. It has since told us that the 8C Bulldozer is actually 1.2B transistors. The die size is still accurate at 315mm2. "
    http://www.anandtech.com/show/4955/the-bulldozer-r...

    Ok, there's just no way this core transistor miscount is a mistake. The level of incompetence required for that to be a mistake and not a PR plot is staggering.

    For that reason, BoFox, I don't doubt what you are saying about the Barts 68xx's and 6790...

    I'd sure love to see that exposed far and wide as well, if true. It's just amazing to me, a staggering "error" on the most basic Bulldozer spec... and we're supposed to just pretend it never happened, and no explanation is ever given.

    Yes, there's a chance you are correct on your calculations, BoFox; certainly can't put it past AMD given the track record.
  • maree - Friday, February 17, 2012 - link

    From the consumer's point of view, the 7750 is the best card that doesn't require an external power connector, and hence can fit in a standard case with a standard SMPS. The 7770 makes sense for those who CrossFire, especially with long idle, and it compares favourably against the 6950/6970 for somebody who plans to buy only one 7770 now and another one later when a better deal is available.

    From AMD's point of view, the 7750 seems to be targeted at the 80% of the market who buy Intel PCs but are envious of the graphics capabilities of the puny Llano and even tinier Brazos. The 7770 seems to be targeted at the same folks BD was targeted at: somebody prepared to buy a product (BD/7770) that is priced close to the competition (Core i5/GTX 560) but gets beaten in all benchmarks, and is priced above the old generation (Phenom II/6850) but still loses to it in many benchmarks. In short, the 7770 is a Bulldozeresque disaster.

    The situation would have been much better if they had marketed these cards as the 7670 and 7750, because that is where they belong based on die/transistor size and performance. Definitely a slip-up from AMD's graphics marketing dept.
  • Galidou - Sunday, February 19, 2012 - link

    LoL rarson, let's not get into that kind of argument with chizow, you'll end up in WW2 history of tank pricing failures due to the fact that they were not double the raw power of last-gen tanks from company X vs. company Y... history, history, history...

    Be careful with Chizow's arguments, he lives in the past; nothing new in reading his comments, it's already in the books and ready for anyone that reads it to perceive it the way they want (different from an ATI or Nvidia fanboy point of view)... :P
