AMD Radeon HD 7750 & Radeon HD 7770 GHz Edition Review: Evading The Price/Performance Curve
by Ryan Smith & Ganesh T S on February 15, 2012 12:01 AM EST
Theoretical Performance
Before moving on from compute performance, we wanted to quickly take a look at theoretical performance. Identifying the theoretical performance of the 7700 series in relation to other cards may help explain why it’s often trailing the 5770 and 6850.
A quick look at texture fillrates gives us our answer for the 7750: it has even lower texture performance than the 5750, never mind the 5770. Thankfully very few games are heavily texture bound these days – and if they were the 7750 likely wouldn’t have enough VRAM for them anyhow – but the massive gap in theoretical texture performance between the 7750 and 7770 means that the 7750 is behind virtually everything else.
Conversely, if you look at the 7750’s pixel fillrate, it’s almost identical to the 7770’s, which in turn trails the 5770. However, in this case the 3DMark Pixel Fill test appears to be heavily memory bandwidth bound, which is why it trails the 6870 by so much.
Moving on, looking at tessellation performance is both good and bad for the 7700 series. With a maximum of 1 triangle/clock, GCN’s tessellation improvements can only do so much. It’s enough to vault past the 5770, but the 6870 still has better tessellation performance even with its lower clockspeed. Given AMD’s use of off-die buffering, it’s entirely possible we’re looking at a memory bandwidth constraint here.
Unigine Heaven backs these findings and then some. Tessellation performance is improved relative to the 5700 series, but at best the 7700 series is only going to catch the 6850.
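The theoretical figures discussed above follow directly from unit counts and clockspeeds: texture fillrate is TMUs × core clock, and pixel fillrate is ROPs × core clock. A minimal sketch of that arithmetic is below; the spec figures are approximations from published spec sheets and are illustrative only, and as the 3DMark results show, a memory-bandwidth-bound test can land well below these theoretical peaks.

```python
# Theoretical throughput from unit counts and core clock.
# Spec figures below are illustrative approximations of the cards discussed.
cards = {
    # name: (TMUs, ROPs, core clock in MHz)
    "HD 7750": (32, 16, 800),
    "HD 7770": (40, 16, 1000),
    "HD 5770": (40, 16, 850),
}

def texture_fillrate(tmus: int, clock_mhz: int) -> float:
    """GTexels/s, assuming one texel per TMU per clock."""
    return tmus * clock_mhz / 1000.0

def pixel_fillrate(rops: int, clock_mhz: int) -> float:
    """GPixels/s, assuming one pixel per ROP per clock."""
    return rops * clock_mhz / 1000.0

for name, (tmus, rops, clock) in cards.items():
    print(f"{name}: {texture_fillrate(tmus, clock):.1f} GTex/s, "
          f"{pixel_fillrate(rops, clock):.1f} GPix/s")
```

Note how the same calculation explains both gaps: the 7750’s 32 TMUs at 800MHz put its texture rate well behind a 40-TMU part, while its 16 ROPs keep its pixel rate in the same ballpark as the 7770’s.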
155 Comments
Dianoda - Wednesday, February 15, 2012 - link
Yeah, I jumped on that BB/Visiontek HD4850 512MB deal as well. Bought the card about a week before the official launch and at a $50 discount on top of that. Timing was perfect, too, as I had just finished my build, short one video card (borrowed a 3850 from a friend for a few weeks). I finally upgraded from that card to a 6950 2GB (BIOS modded to 6970) about a month ago - Skyrim was just too much for the 4850 to handle @ 2560x1440. The 6950 2GB is a great card for the price if you're willing to perform the BIOS mod (and don't mind rebate forms).
nerrawg - Thursday, February 16, 2012 - link
Exactly! I bought 2 4850's in the UK in 2009 for £65 ($95) each - best GPU purchase I have made in 12 years! I now have a single 6870 that I bought in 2011 for £120 - but it's not really an upgrade at all. Thought I would wait and get a second one cheaper, but now I don't think that will happen.
2008-2009 was the sweet spot of the decade for desktop GPUs. The way things are going with the desktop (AMD bullsnoozer etc. etc.) I fear that it might have even been the sweet spot of GPU performance for the decade to come as well. I would love to see some massive progress in graphics, but it seems that all the "suits" care about nowadays is "smart" this and "smart" that. I can't really blame them either, because until pc programmers get their act together and actually start making apps and games that push what is possible on current hardware, I don't see any reason why I need 2X the GPU and CPU compute power every 1-2 years.
Come on guys - we are all waiting for the next "Crysis" - if it doesn't come then it might spell the end of the enthusiast desktop
StrangerGuy - Wednesday, February 15, 2012 - link
AMD having a fail product at $160 that couldn't even beat an almost 1.5 year old $150 6870 isn't surprising, considering they are also the ones with the cheek to price their FX-8150 at near-2600K prices.
thunderising - Wednesday, February 15, 2012 - link
The only problem I have with AMD on this card is WHY THE LOW BANDWIDTH. The card performs nearly 10% faster when the memory is clocked at 6GHz QDR (TPU reviews) and 15% with the core clock matching XFX's OCed speed.
I think that 6GHz memory modules would have taken the HD7770 a long way ahead. The performance boost would have been enough to hit HD6850 performance, or beat it in all cases, and at that point, this card at $159 would make sense.
Right now, until the price hits about $129, this doesn't make sense.
chizow - Wednesday, February 15, 2012 - link
But you get GCN, 28nm and a bottle of verdetard?
Zoomer - Wednesday, February 15, 2012 - link
GCN is worse than useless for gamers and non-compute users.
jokeyrhyme - Wednesday, February 15, 2012 - link
I think I've built my last system with an AMD CPU. Intel completely abused their monopoly and decimated AMD's success in the CPU department, and I don't think AMD will have an enthusiast-quality CPU ever again. :(
That said, I think I will still use AMD GPUs for a while yet.
nVidia's Kepler may beat AMD later this year, but AMD actually has an open-source driver developer on staff and routinely publishes hardware documentation. AMD GPUs will probably have better support for Wayland than their nVidia counterparts due to these factors. If you use Linux and want to stay on the cutting edge, then I don't think picking nVidia is particularly wise.
ganeshts - Wednesday, February 15, 2012 - link
At least in the HTPC area, NVIDIA is miles ahead of AMD in the open source support department. Almost all Linux HTPCs capable of HD playback have NVIDIA GPUs under the hood, thanks to their well-supported VDPAU feature.
AMD started getting serious with xvBA only towards the end of last year, and they still have a lot of catching up to do [ http://phoronix.com/forums/showthread.php?65688-XB... ]
Ananke - Wednesday, February 15, 2012 - link
AMD is several years behind NVidia on the compute side... actually they are nowhere as of today. The AMD 7xxx series is so ridiculously priced, it will not get a user base attractive to developers. Actually, I am at the point of considering NVidia cards for computing, despite that I hate their heat and power consumption. AMD had their chance and they blew it.
Besides, we shall see where the AMD ex-VP will go - that company most likely will be the next big player in graphics and high performance computing. Probably Apple.
PeskyLittleDoggy - Thursday, February 16, 2012 - link
In my country, company policy dictates you cannot leave your company and work for a competing company if you have valuable R&D knowledge. That's part of the restraint-of-trade clause in your contract.
Basically what I'm saying is, AMD's ex-VP will not be able to work at any company with a graphics department for 2 years if the contract is similar to mine. I can't remember now, but some CEO was sued for that recently.