Final Thoughts

Throughout our review we’ve been calling the Radeon R9 285 a lateral move for AMD, and as we’ve seen in our results this is for good reason. Despite all of the architectural and feature changes between the R9 285 and its R9 280 predecessor – everything from the GCN 1.2 feature set to color compression to the smaller VRAM pool – the R9 285 truly is a lateral for AMD. At the end of the day it brings a very minor 3-5% performance increase over the R9 280 with virtually no change in price or power consumption. Functionally speaking it’s just an R9 280 with more features.

To that end, laterals like the R9 285 are currently an oddity in the video card landscape, but they’re something we should expect to see more of in the future. As GPU architectures mature and progress on new manufacturing nodes continues to slow, we no longer get the same yearly or even biennial shakeup in the GPU landscape. Tahiti is nearly three years old at this point and still going strong, and the 28nm process it’s built on is going to be with us for a while yet. This means that new generations of video cards are going to be spaced farther apart, which creates an opening for smaller refreshes such as Tonga and GCN 1.2.

From a feature standpoint then, Tonga and the underlying GCN 1.2 architecture are a small but nonetheless impressive iteration on what AMD has already done with GCN 1.1. I think it’s going to take some time to really see the impact of the newer ISA, but the improvements to geometry performance and color compression are very immediate and very potent. The fact that AMD has been able to offset a roughly 30% reduction in raw memory bandwidth through the use of color compression alone is certainly a feather in AMD’s cap, and this is only going to become more important over time now that we have hit a wall on GDDR5 clockspeeds and memory bus widths, especially at the high end. Meanwhile AMD’s upgrades to their video decode and encode capabilities should not go unnoticed; AMD has finally caught up to NVIDIA on video decoding – especially in 4K H.264 compatibility – and the ability to encode 4K H.264 in hardware may yet prove advantageous.
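To put those bandwidth figures in perspective, here is a minimal back-of-the-envelope sketch in Python. The bus widths and effective GDDR5 data rates are the published R9 280 and R9 285 specifications; the break-even compression ratio it computes is purely illustrative, since it assumes all memory traffic is compressible color data, which real workloads are not.

    # Raw bandwidth the R9 285 gives up versus the R9 280, and the average
    # compression ratio on color traffic needed to break even.
    def raw_bandwidth_gb_s(bus_width_bits, data_rate_gbps):
        """Raw memory bandwidth in GB/s: (bus width / 8 bits per byte) * data rate."""
        return bus_width_bits / 8 * data_rate_gbps

    r9_280 = raw_bandwidth_gb_s(384, 5.0)   # 240 GB/s
    r9_285 = raw_bandwidth_gb_s(256, 5.5)   # 176 GB/s

    deficit = 1 - r9_285 / r9_280           # ~27% less raw bandwidth
    break_even = r9_280 / r9_285            # ~1.36x average compression to match

    print(f"R9 280: {r9_280:.0f} GB/s, R9 285: {r9_285:.0f} GB/s")
    print(f"Raw bandwidth deficit: {deficit:.0%}; break-even compression: {break_even:.2f}x")

In other words, delta color compression only needs to average a bit better than a 1.36:1 ratio on color traffic for the narrower 256-bit bus to behave like Tahiti’s 384-bit bus, which is consistent with how well the R9 285’s benchmark results hold up despite the cut.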

As for the R9 285’s customer base and its competition, AMD’s product positioning continues to be straightforward. AMD has continued to undercut NVIDIA on a price/performance basis across the entire Radeon 200 family, and the R9 285 upholds this tradition. If we’re just looking for the card with the best performance for the price, the R9 285 solidly outperforms NVIDIA’s GTX 760 by 12-15%, and it’s no coincidence that GTX 760 prices have slid over the last week in response.

The ramification of those price cuts is that AMD no longer holds a real price/performance advantage – the price gap just about matches the performance gap at this point – but it does mean that the R9 285 sits in its own little performance niche as a more powerful but more expensive card than the GTX 760. The end result is a tossup: you could buy either and be satisfied for the price.

AMD’s lineup, on the other hand, is a bit more volatile and will remain so until R9 280 stocks run out. With AMD’s partners selling off their remaining R9 280 cards at clearance prices, the R9 280 is a very strong value proposition at $210-$220, offering virtually identical performance to the R9 285 for $40 less. However, like all GPU discontinuation clearance sales this situation will be fleeting; at some point the R9 280 will go away and the $250 R9 285 will be the status quo. In the meantime, buyers are left with the harder choice of picking price or features: the R9 285 has a few features that are going to make a difference in the long run, such as full support for DisplayPort Adaptive-Sync (the basis for AMD’s FreeSync) and a 4K-capable video decoder, but whether that’s worth a $40 premium is going to be very situational, if not outright difficult to justify.
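For what it’s worth, the premium math here is simple. Using only the prices quoted above (clearance R9 280s at $210-$220 versus the $250 R9 285), a quick sketch:

    # Premium of the $250 R9 285 over clearance-priced R9 280s, versus the
    # ~3-5% performance gain measured in this review.
    r9_285_price = 250
    for r9_280_price in (210, 220):
        premium = r9_285_price / r9_280_price - 1
        print(f"R9 280 at ${r9_280_price}: R9 285 premium is {premium:.0%}")
    # -> roughly a 14-19% price premium for 3-5% more performance plus the
    #    newer GCN 1.2 feature set

So for the moment buyers are effectively paying a 14-19% premium for the feature set rather than for performance, which is why the decision is so situational.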

All things considered, the R9 285 is a solid card; however, I remain unconvinced that AMD has equipped it with the right amount of memory. From a GPU performance perspective I feel that AMD is overshooting by promoting the R9 285 as a 2560x1440 card, as the raw performance to run at that resolution with high quality settings just isn’t there. But even as a 1080p card, 2GB for $250 is tough to swallow, and it’s made all the worse by the 3GB R9 280. 2GB is enough for 1080p today, but it seems unlikely that will still be true in 2-3 years. A 4GB R9 285 would be a much safer bet as a result, though it doesn’t necessarily follow that it would be worth a price premium at this time.

Switching gears for a moment, second-tier cards like the R9 285 are often not the strongest showcase for a new GPU like Tonga. Given all the similarities between Tonga and Tahiti, it seems like it’s only a matter of time until the R9 280X gets the Tonga treatment. And even though it would be the second Tonga card, I think it could prove to be just as interesting as the R9 285 (if not more so), as it would give us a chance to see just what an unrestricted Tonga product can do. To that end, I hope AMD doesn’t leave us waiting too long for a fully enabled Tonga SKU.

Comments

  • TiGr1982 - Thursday, September 11, 2014 - link

    BTW, is Tonga the only new GPU AMD has to offer in 2014?
    (if I'm not mistaken, the previous one from AMD, Hawaii, was released back in October 2013, almost a year ago)
    Does anybody know?
  • HisDivineOrder - Thursday, September 11, 2014 - link

    The thing is the moment I heard AMD explaining how Tonga was too new for current Mantle applications, I was like, "And there the other shoe is dropping."

    The promise of a low-level API is that you get low-level access and the developer carries more of the burden of optimizing the game instead of a driver team. This is great for the initial release of the game and great for a company that wants to have less of a (or no) driver team, but it's not so great for the end user, who is going to wind up getting new cards and needing that Mantle version to work properly on games no longer supported by their developer.

    It's hard enough getting publishers and/or developers to work on a game a year or more after release to fix bugs that creep in and in some cases hard to get them to bother with resolution switches, aspect ratio switches, the option to turn off FXAA, the option to choose a software-based AA of your choice, or any of a thousand more doohickeys we should have by now as bog-standard. Can you imagine now relying on that developer--many of whom go completely out of business after finishing said title if they happen to work for Activision or EA--to fix all the problems?

    This is why it's better to have a driver team working on it. Even though the driver team may be somewhat removed from the development of the game, the driver team continues to have an incentive to fix that game going forward, even if it's a game no longer under development at the publisher. You're going to be hard pressed to convince Bobby Kotick at Activision that it's worth it to keep updating games older than six months (or a year for Call of Duty), because at a certain point they WANT you to move on to another game. But nVidia and AMD (and I guess Intel?) want to make that game run well on next-gen cards to help you move.

    This is where Mantle is flawed and where Mantle will never recover. Every time they change GCN, it's going to wind up with a similar problem. And every time they'll wind up saying, "Just switch to the DX version." If Mantle cannot be relied upon for the future, then it is Glide 2.0.

    And why even bother at all? Just stick with DirectX from the get-go, optimize for it (as nVidia has shown there is plenty of room for improvement), and stop wasting any money at all on Mantle since it's a temporary version that'll rapidly be out of date and unusable on future hardware.
  • The-Sponge - Thursday, September 11, 2014 - link

    I do not understand how they got their R9 270X temperatures; my OC'd R9 270X never even comes close to the temps they got...
  • mac2j - Friday, September 12, 2014 - link

    It's great that they've caught up with H.264 on hardware, and the card otherwise looks fine. The bottom line for me, though, is that I don't see the point of buying a card now without H.265 in hardware and an HDMI 2.0 port - two things Maxwell will bring this year. I haven't heard what AMD's timetable is there though.
  • P39Airacobra - Friday, October 17, 2014 - link

    It really irritates me that they are making these cards throttle to keep power and temps down! That is pathetic! If you can't make the thing right just don't make it! Even if it throttles 0.1MHz it should not be tolerated! We pay good money for this stuff and we should get what we pay for! It looks like the only AMD cards worth anything are the 270's and under. It stinks you have to go Nvidia to get more power! Because Nvidia really rapes people with their prices! But I must say the GTX 970 is priced great if it is still around $320. But AMD should have never even tried with this R9 285! First of all, when you pay that much you should get more than 2GB. And another thing, the card is pretty much limited to the performance of the R9 270's because of the VRAM count! Yeah the 285 has more power than the 270's, but what's the point when you do not have enough VRAM to take the extra power where you need a card like that to be? In other words, if you are limited to 1080p anyway, why pay the extra money when an R7 265 will handle anything at 1080p beautifully? This R9 285 is a pointless product! It is like buying a rusted-out Ford Pinto with a V-8 engine! Yeah the engine is nice! But the car is a pos!
  • P39Airacobra - Friday, January 9, 2015 - link

    (QUOTE) So a 2GB card is somewhat behind the times as far as cutting edge RAM goes, but it also means that such a card only has ¼ of the RAM capacity of the current-gen consoles, which is a potential problem for playing console ports on the PC (at least without sacrificing asset quality).

    (SIGH) So now even reviewers are pretending the consoles can outperform a mid-range GPU! WOW! How about telling the truth like you did before you got paid off! The only reason a mid-range card has problems with console ports is because they are no longer optimized! They just basically make it run on PC and say xxxx you customers, here it is! And no, the 8GB on the consoles is used for everything, not only for VRAM! We are not stupid idiots that fall for anything like the idiots in Germany back in the 1930s!
