AMD's Gaming Evolved Application

During AMD’s “partner time” block at the 2014 GPU Product Showcase, one of the projects presented was the Raptr social networking and instant messaging application. AMD would be partnering with Raptr, the company behind the application of the same name, to produce an AMD-branded version of the utility called the “AMD Gaming Evolved App, Powered By Raptr”.

In a nutshell, the Gaming Evolved App (GEA) is AMD’s attempt to bring another value-add feature to the Radeon brand. And although AMD will never explicitly say this, the GEA is clearly intended to counter NVIDIA’s successful GeForce Experience utility, which exited beta back in May and has continued to add features since.

Raptr/GEA contains a wealth of functionality, the application being several years old at this point, but the key feature as a video card utility, and the reason AMD has picked it up, is its latest addition: the game optimization service. Just launched last month in beta, the optimization service is a functional clone of GeForce Experience’s optimization service. Designed with the same goals in mind, it is intended to give gamers uninterested in configuring their games – or even just looking for a place to start – a way to simply download a suitable collection of settings for their games and hardware and apply those settings to their games.

The concept is in practice very similar to the recommended settings that most games apply today, but driven by the GPU manufacturer instead of the game developer, and kept up to date with hardware/software changes as opposed to being set in stone when the game went gold. Even for someone like a professional GPU reviewer, it’s a very nifty thing to have when turning up every setting isn’t going to be practical.
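To make the concept concrete, here is a minimal sketch of what such a service amounts to: look up a recommended settings profile for a given GPU and game, then write those values into the game’s configuration file. This is our own illustration in Python; the endpoint, JSON schema, INI section, and key names are all hypothetical, as neither AMD nor Raptr has documented the actual protocol.

```python
# Hypothetical sketch of a GEA-style optimization service client.
# The service URL and settings schema below are assumptions for illustration.
import configparser
import json
import urllib.parse
import urllib.request

def fetch_recommendations(gpu: str, game: str) -> dict:
    """Ask the (hypothetical) settings service for a recommended profile."""
    query = urllib.parse.urlencode({"gpu": gpu, "game": game})
    url = f"https://settings.example.com/v1/profiles?{query}"
    with urllib.request.urlopen(url) as resp:
        # e.g. {"TextureDetail": "4", "ShadowQuality": "2"}
        return json.load(resp)

def apply_to_ini(settings: dict, ini_path: str) -> None:
    """Write the recommended values into the game's settings file."""
    config = configparser.ConfigParser()
    config.read(ini_path)
    if "Graphics" not in config:
        config["Graphics"] = {}
    for key, value in settings.items():
        config["Graphics"][key] = str(value)
    with open(ini_path, "w") as f:
        config.write(f)
```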

To get right to the point then, while we’re big fans of the concept, it’s clear this is a case of AMD tripping over themselves in trying to react to something NVIDIA has done by finding the fastest way of achieving the same thing. Like GeForce Experience, AMD has started bundling GEA with their drivers and installing it by default, but unlike GFE it’s still in beta at this point, and a very rough beta at that. And not to take an unnecessary shot at AMD, but even in beta GeForce Experience was never this raw or this incomplete.

So why are we so down on GEA? There are a few reasons, the most basic of which is that the Raptr service lacks enough performance data for GEA to offer meaningful recommendations. Even on a fairly old card like a Radeon HD 7950, GEA was only able to find settings for 5 of the 11 games we have installed on our GPU testbed, failing to include settings for a number of games that are months (if not years) old. To be fair, every service has to start somewhere, and GFE certainly didn’t launch with a massive library of games, but 5 games, none newer than March, is a particularly bad showing.

Now a lot of this has to do with how Raptr collects the performance data it uses for recommendations. NVIDIA for their part decided to do everything in-house, relying on their driver validation GPU farms to benchmark games across multiple settings and find a good balance based on parameters picked by the GFE development team. Raptr, though backed by AMD, has nothing resembling NVIDIA’s GPU farms, and as such is going the crowdsourced route, relying on telemetry taken from Raptr users’ computers. Raptr’s data acquisition method is not necessarily wrong, but it means there’s no one to bootstrap the service with data, so the service has started out with essentially nothing.
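As a rough illustration of how a crowdsourced approach can work, the sketch below pools frame rate reports from users and recommends the highest quality preset that still clears a target frame rate. The data, the 60 fps target, and the minimum sample count are our own assumptions; Raptr hasn’t published how it actually weighs its telemetry. Note how the bootstrapping problem falls out naturally: with too few samples, there is simply nothing to recommend.

```python
# Toy model of crowdsourced settings recommendations (all data illustrative).
from statistics import median

# (gpu, game, preset) -> average-fps samples reported by users
telemetry: dict[tuple[str, str, str], list[float]] = {
    ("Radeon HD 7950", "BioShock Infinite", "Ultra"):  [48.0, 52.5, 45.1],
    ("Radeon HD 7950", "BioShock Infinite", "High"):   [61.2, 66.0, 58.9],
    ("Radeon HD 7950", "BioShock Infinite", "Medium"): [82.3, 79.5, 88.0],
}

def recommend(gpu: str, game: str, target_fps: float = 60.0,
              min_samples: int = 3) -> str | None:
    """Return the highest preset whose median reported fps meets the target."""
    for preset in ("Ultra", "High", "Medium", "Low"):  # best to worst
        samples = telemetry.get((gpu, game, preset), [])
        if len(samples) >= min_samples and median(samples) >= target_fps:
            return preset
    return None  # not enough data yet: the bootstrapping problem in action

print(recommend("Radeon HD 7950", "BioShock Infinite"))  # -> High
```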

Raptr for their part is aware of the problem they’re faced with, and in time the distribution of GEA alongside their own Raptr application will hopefully ensure that there are enough users playing enough games to collect the necessary data. Even so, they did have to implement what amounts to a solution to the tragedy of the commons to make sure that data gets collected: users cannot receive settings from the Raptr service unless they provide data in return. Turning off the telemetry service will also turn off the client’s ability to pull down settings, full stop. Given the service’s requirements for data collection it’s likely the best solution to the problem, but regardless we have to point out that Raptr is alone in this requirement; NVIDIA can offer GFE without requiring performance telemetry from users.
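Reduced to its essence, the quid pro quo looks something like the toy sketch below: the client refuses to pull settings down unless telemetry reporting is switched on. This is our own reconstruction; the real client’s internals are not public.

```python
# Toy sketch of telemetry-gated settings access (our own reconstruction).
class SettingsClient:
    def __init__(self, telemetry_enabled: bool):
        self.telemetry_enabled = telemetry_enabled

    def get_settings(self, gpu: str, game: str) -> dict:
        if not self.telemetry_enabled:
            # No data contributed, no recommendations received. Full stop.
            raise PermissionError("enable telemetry to receive optimized settings")
        # Stand-in for a real service call (see the earlier fetch sketch).
        return {"TextureDetail": "4", "ShadowQuality": "2"}

client = SettingsClient(telemetry_enabled=False)
try:
    client.get_settings("Radeon HD 7950", "BioShock Infinite")
except PermissionError as err:
    print(err)  # -> enable telemetry to receive optimized settings
```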

Moving on, the other showstopper with GEA’s current optimization service is that the UI has obviously been an afterthought. The GEA UI lists settings by the raw values used in a game’s settings file rather than by the names of those values. For example, “Ultra” texture quality in BioShock Infinite is labeled as texture detail “4”, or worse. Without sufficient labeling it’s impossible to tell just what those settings mean, let alone what they may do. As such, applying GEA settings right now is something of a shot in the dark, as you don’t know what you’re going to get.
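To illustrate what’s missing, all the UI would need is a table mapping each game’s raw config values back to their in-game labels. In the sketch below, only the “4” meaning Ultra is taken from the example above; the remaining entries are assumed for illustration.

```python
# Hypothetical label table for BioShock Infinite's texture detail setting.
# Only "4" -> "Ultra" comes from the article; the rest are assumed.
TEXTURE_DETAIL_LABELS = {
    "0": "Very Low",
    "1": "Low",
    "2": "Medium",
    "3": "High",
    "4": "Ultra",
}

def label(raw_value: str) -> str:
    """Translate a raw config value into what the UI should display."""
    return TEXTURE_DETAIL_LABELS.get(raw_value, f"Unknown ({raw_value})")

print(label("4"))  # -> Ultra, instead of the bare "4" GEA shows today
```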

Finally, presumably as a holdover from the fact that Raptr is free, GEA runs what can only be described as ads. These aren’t straight-up advertisements, instead directing users towards other services Raptr/GEA provides, such as Free-2-Play games and a rewards service. But the end game is the same, as these services are paid for by Raptr’s sponsors and are intended to drive users towards purchasing games and merchandise from those sponsors. Far be it from us to look down upon advertisements – after all, AnandTech is ad supported – but there’s something to be said against bundling ad-supported applications with a driver download. We’re at something of a loss to explain why AMD doesn’t just foot the complete bill for their customized version of the Raptr client and have the ads removed entirely.

At any rate, we do have some faith that in time these issues can be dealt with and GEA can essentially be fixed, but right now GEA is far too raw for distribution. It needs to go back into development for another few months or so (and the service needs to be bootstrapped with many more computer configurations and games) before it will be of suitable quality for inclusion in AMD’s drivers. Otherwise AMD is doing their users a disservice by distributing inferior, ad-supported software alongside the software required to use their products.

The Test

For the launch of the Radeon R9 290, the press and launch drivers are AMD’s recently released Catalyst 13.11 Beta v8 drivers. Along with support for the 290 and the 47% fan speed override, the only other changes in these drivers involve Batman: Arkham Origins and Battlefield 4, games we aren’t using for this review, so the results will be consistent with past drivers. Meanwhile, for NVIDIA’s cards we’re continuing to use their release 331.58 drivers.

CPU: Intel Core i7-4960X @ 4.2GHz
Motherboard: ASRock Fatal1ty X79 Professional
Power Supply: Corsair AX1200i
Hard Disk: Samsung SSD 840 EVO (750GB)
Memory: G.Skill RipjawZ DDR3-1866 4 x 8GB (9-10-9-26)
Case: NZXT Phantom 630 Windowed Edition
Monitor: Asus PQ321
Video Cards: AMD Radeon R9 290X
AMD Radeon R9 290
XFX Radeon R9 280X Double Dissipation
AMD Radeon HD 7970 GHz Edition
AMD Radeon HD 7970
AMD Radeon HD 6970
AMD Radeon HD 5870
NVIDIA GeForce GTX Titan
NVIDIA GeForce GTX 780
NVIDIA GeForce GTX 770
Video Drivers: NVIDIA Release 331.58
AMD Catalyst 13.11 Beta v1
AMD Catalyst 13.11 Beta v5
AMD Catalyst 13.11 Beta v8
OS: Windows 8.1 Pro

 

Comments

  • DMCalloway - Wednesday, November 6, 2013

    Not sure how to interpret your analogy. Heat output is measured in watts (power) needing dissipation. This card runs at up to 95°C while drawing around 50 more watts of power than a 780, which runs at up to 80°C. Out of the gate the heat sink on the R9 290 is going to have to be at LEAST 20% more effective than the current 780 heat sink, and all that extra heat is still being pumped into the case.
  • Galidou - Friday, November 15, 2013

    Well, as I can see you are rather unfamiliar with power leakage due to temperature... anyway, I won't get into the subject, I'll let you google it. Getting a card to run a lot cooler at the same frequencies can reduce power usage, depending on how badly it affects a given node.

    A 20 degree Celsius difference can have a great impact on power usage, which I think might turn the tide. On the 28nm node, power leakage has been a bigger problem than before, probably a reason why Nvidia has been using better reference coolers, thus enhancing performance-per-watt for the last few generations of video cards.

    That is the basis of my previous analogy.
  • TrantaLocked - Tuesday, November 5, 2013

    With the card in the case, with headphones on, with sounds/music from the game playing, hearing the 290 or any card would be tough. I know what loud video cards sound like (I owned the 4890 with a single fan design, and I always ran the fan at manual 40%-50% speed for gaming), and when playing a game with headphones the sound is barely audible and definitely not distracting.
  • Calinou__ - Tuesday, November 5, 2013

    I have a reference 570 in a sound-dampened case and a headset and I can easily hear it, even at idle (40% fan speed)... and I'd guess the R9 290 is noisier.
  • Galidou - Tuesday, November 5, 2013

    A reference 570, if not cleaned regularly (once every two months), easily goes up to 62 dB. The GTX 480 went up to 64 dB (brand new) and Nvidia fans praised it even while AMD fans were saying it was loud, nothing new here... Nvidia fans once said that 64 dB is ''nothing'' for a good performing card... Look at them now whining about the same situation, really fun to see the tides turn around...

    That was in a time when reference coolers were much more common and represented a MUCH bigger % of the market. I would never buy an AMD reference-cooled card. Nvidia fans only started to care about noise and temperature when the 600 series came out and they had an advantage for the first time.
  • Finally - Tuesday, November 5, 2013

    Who buys a card with the default fan anyway? Get one with a custom design and problem solved. How hard is that?
  • Aikouka - Tuesday, November 5, 2013

    Anyone that buys it right now? =P
  • kmmatney - Tuesday, November 5, 2013

    I used to think that too, until I actually bought a loud HD 4890. I ended up having to get an Accelero, which made it dead quiet. My current NVIDIA-based card is also really loud, but I fixed that by underclocking it most of the time. Really loud cards are just not fun to deal with. I'd wait for third party cards to come out with better cooling solutions.
  • hoboville - Tuesday, November 5, 2013

    Except that it's hot, noisy, and basically pointless to overclock because it's as loud as a medium party without music. So as it is, it's a bad card.

    However, since it's only $400, one could go out and buy a water cooling setup and have a card faster than the 290x for about the same or less money. For Titan money, you could get a second 290 and add some more rads to your setup. The thing is...if you do want to OC (and why wouldn't you with water cooling?), you'll have to dissipate over 800 watts of heat...
  • Slomo4shO - Tuesday, November 5, 2013

    Well done AMD. Competition at its finest!
