AMD's Gaming Evolved Application

During AMD’s “partner time” block at the 2014 GPU Product Showcase, one of the projects presented was the Raptr social networking and instant messaging application. AMD is partnering with Raptr, the company of the same name behind the application, to produce an AMD-branded version of the utility called the “AMD Gaming Evolved App, Powered By Raptr”.

In a nutshell, the Gaming Evolved App (GEA) is AMD’s attempt to bring another value-add feature to the Radeon brand. And although AMD will never explicitly say this, the GEA is clearly intended to counter NVIDIA’s successful GeForce Experience utility, which exited beta back in May and has continued to add features since.

Raptr/GEA contains a wealth of functionality, the application being several years old at this point, but the key feature as a video card utility – and the reason AMD has picked it up – is its latest addition, the game optimization service. Just launched last month in beta, the optimization service is a functional clone of GeForce Experience’s optimization service. Designed with the same goals in mind, the GEA optimization service is intended to give gamers uninterested in configuring their games – or even just looking for a place to start – a way to simply download a suitable collection of settings for their games and hardware, and then apply those settings.

The concept is in practice very similar to the recommended settings that most games apply today, but driven by the GPU manufacturer instead of the game developer, and kept up to date with hardware/software changes as opposed to being set in stone when the game went gold. Even for someone like a professional GPU reviewer, it’s a very nifty thing to have when turning up every setting isn’t going to be practical.

To get right to the point then, while we’re big fans of the concept, it’s clear that this is a case of AMD tripping over themselves in reacting to something NVIDIA has done, looking for the fastest way of achieving the same thing. Like GeForce Experience, AMD has started bundling GEA with their drivers and installing it by default, but unlike GFE it’s still in beta at this point, and a very rough beta at that. And not to take an unnecessary shot at AMD, but even in beta GeForce Experience was never this raw or this incomplete.

So why are we so down on GEA? There are a few reasons, the most basic being that the Raptr service lacks enough performance data for GEA to offer meaningful recommendations. Even on a fairly old card like a Radeon HD 7950, GEA was only able to find settings for 5 of the 11 games we have installed on our GPU testbed, failing to include settings for a number of games that are months (if not years) old. To be fair, every service has to start out somewhere, and GFE certainly didn’t launch with a massive library of games, but 5 games, none newer than March, is a particularly bad showing.

Now a lot of this has to do with how Raptr collects the performance data it uses for recommendations. NVIDIA for their part decided to do everything in house, relying on their driver validation GPU farms to benchmark games across multiple settings to find a good balance based on parameters picked by the GFE development team. Raptr, though backed by AMD, has nothing resembling NVIDIA’s GPU farms, and as such is going the crowdsourced route, relying on telemetry taken from Raptr users’ computers. Raptr’s data acquisition method is not necessarily wrong, but it means there was no one to bootstrap the service with data, and as a result the service has started out with essentially nothing.

Raptr for their part is aware of the problem they’re faced with, and in time the distribution of GEA alongside their own Raptr application will hopefully ensure that there are enough users playing enough games to collect the necessary data. Even so, they did have to implement what amounts to a solution to a tragedy of the commons problem to make sure that data gets collected: users cannot receive settings from the Raptr service unless they provide data in return. Turning off the telemetry service will also turn off the client’s ability to pull down settings, full stop. Given the service’s requirements for data collection it’s likely the best solution to the problem, but regardless we have to point out that Raptr is alone in this requirement; NVIDIA can offer GFE without requiring performance telemetry from users.

Moving on then, the other showstopper with GEA’s current optimization service is that the UI has obviously been an afterthought. The GEA UI lists settings by the raw values used in a game’s settings file rather than by the names of those values; “Ultra” texture quality in Bioshock Infinite, for example, is labeled as texture detail “4”. Without proper labeling it’s impossible to tell what those settings mean, let alone what they may do. As such, applying GEA settings right now is something of a shot in the dark, as you don’t know what you’re going to get.
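To illustrate what’s missing, here is a minimal sketch of the kind of value-to-label mapping the UI would need. This is purely hypothetical Python, neither Raptr’s actual code nor Bioshock Infinite’s real settings schema; only the “4” = Ultra pairing comes from what GEA displays, and the other labels are assumed for the example.

```python
# Hypothetical sketch: games commonly store quality presets in their
# settings files as raw enum values. A UI needs a table like this to
# show users a meaningful label instead of a bare number.
TEXTURE_DETAIL_LABELS = {
    0: "Low",        # assumed
    1: "Medium",     # assumed
    2: "High",       # assumed
    3: "Very High",  # assumed
    4: "Ultra",      # per the article: texture detail "4" is Ultra
}

def describe_setting(raw_value):
    """Translate a raw config value into a human-readable label."""
    return TEXTURE_DETAIL_LABELS.get(raw_value, "Unknown ({})".format(raw_value))

print(describe_setting(4))  # -> Ultra
print(describe_setting(7))  # -> Unknown (7)
```

Without such a table, all a client can do is echo the raw number back at the user, which is exactly the behavior GEA exhibits today.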

Finally, presumably as a holdover from the fact that Raptr is free, GEA runs what can only be described as ads. These aren’t straight-up advertisements; rather they direct users towards other services Raptr/GEA provides, such as free-to-play games and a rewards service. But the end game is the same, as these services are paid for by Raptr’s sponsors and are intended to drive users towards purchasing games and merchandise from those sponsors. Far be it from us to look down upon advertisements – after all, AnandTech is ad supported – but there’s something to be said against shipping an ad-supported application in a driver download. We’re at something of a loss to explain why AMD doesn’t just foot the complete bill for their customized version of the Raptr client and have the ads removed entirely.

At any rate, we do have some faith that in time these issues can be dealt with and GEA can essentially be fixed, but right now GEA is far too raw for distribution. It needs to go back into development for another few months or so (and the service needs to be bootstrapped with many more computer configurations and games) before it will be of suitable quality for inclusion in AMD’s drivers. Otherwise AMD is doing their users a disservice by distributing inferior, ad-supported software alongside the software required to use their products.

The Test

For the launch of the Radeon R9 290, the press drivers and the launch drivers will be AMD’s recently released Catalyst 13.11 Beta v8 drivers. Along with containing support for the 290 and the 47% fan speed override, the only other changes in these drivers involve Batman: Arkham Origins and Battlefield 4, games which we aren’t using for this review. So the results will be consistent with past drivers. Meanwhile for NVIDIA’s cards we’re continuing to use their release 331.58 drivers.

CPU: Intel Core i7-4960X @ 4.2GHz
Motherboard: ASRock Fatal1ty X79 Professional
Power Supply: Corsair AX1200i
Hard Disk: Samsung SSD 840 EVO (750GB)
Memory: G.Skill RipjawZ DDR3-1866 4 x 8GB (9-10-9-26)
Case: NZXT Phantom 630 Windowed Edition
Monitor: Asus PQ321
Video Cards: AMD Radeon R9 290X
AMD Radeon R9 290
XFX Radeon R9 280X Double Dissipation
AMD Radeon HD 7970 GHz Edition
AMD Radeon HD 7970
AMD Radeon HD 6970
AMD Radeon HD 5870
NVIDIA GeForce GTX Titan
NVIDIA GeForce GTX 780
NVIDIA GeForce GTX 770
Video Drivers: NVIDIA Release 331.58
AMD Catalyst 13.11 Beta v1
AMD Catalyst 13.11 Beta v5
AMD Catalyst 13.11 Beta v8
OS: Windows 8.1 Pro

 

295 Comments

  • Leyawiin - Tuesday, November 5, 2013 - link

    I'd just wait for the ASUS DirectCU II version or something equivalent. Something as hot, loud and power hungry as the old GTX 480 isn't acceptable to me, but drop a couple of those cons and I'd be on board.
  • FuriousPop - Tuesday, November 5, 2013 - link

    No... But..........the.......................noise...................is........... just............too........ loud.......... are....you......getting..............this.......

    *puts on headphones*

    Now then, as i was saying its very loud but but i want it whisper quiet, so buzz off else where then. your 2cents here is not appreciated.

    as a CFx7970 owner (not to mention i had 2xgtx670's just before that which 1 became DOA and yes just as loud as current GPU's) i can safely say - noise is NOT a reason to be placing the whole argument onto when deciding about price/performance wise when there are sooo many different things you can do to reduce the noise generated from your case - if your unwilling to then obviously logic dictates that you would NOT purchase this, clearly.... but but i still wanna compare my 6 month old GPU to this one....... of course you can junior.... of course you can...

    Custom coolers will come and will reduce the temps/noise, maybe not by a massive amount, but maybe just enough to convert some of those green boys over!
  • stangflyer - Tuesday, November 5, 2013 - link

    I am an older gamer at almost 50. I have had many cards since my first 3dfx card. Both AMD/ATI and Nvidia have been in my cases. I have a 1440p monitor but also game at 5040x1050 Eyefinity.
    Currently run 2x 7950 Sapphire flex boost cards. They run relatively quiet as I have an empty slot in-between the cards.

    I listened to some of the sound clips of the 290/x and they reminded me of my 5970 that I ran before my 7950's. I swore I will never have anything that loud in my pc again.

    Will wait and see what the custom coolers bring to the table as I am hoping to go to one card even though I know I will lose some performance. Or just wait for 20nm.

    I was over at my cousin's and he showed me his new GTX 780 with the ACX cooler. Mild OC and it was extremely quiet.

    I will play with either red or green cards, but I do know that I will pay 100 bucks for the noise difference of the GTX 780.
    We will see.
  • lnanek - Tuesday, November 5, 2013 - link

    Hmm, loud cards are good for me, I always use earbuds anyway.
  • Sancus - Wednesday, November 6, 2013 - link

    Everyone who says "just use headphones" probably doesn't realize that these AMD cards are so loud that they would actually be quite disturbing to anyone else in the room, and in the case of Crossfire, probably your entire house or adjacent apartments. 2x 290X's in CF Uber mode are approaching vacuum cleaner levels of noise.

    Not recommending these cards due to noise is not 'biased' it's merely a common sense, practically based choice.
  • ClexRex - Wednesday, November 6, 2013 - link

    Agreed, I've played with the 290X, and unless you keep your PC in the other room, definitely hold out till the aftermarket coolers are here; otherwise you'll be pissed and have to spend another 50 on aftermarket cooling once available.

    Also, a lot of people hating on the 780 forget one thing..that it overclocks better than the 290/290X flat out...it also has the option for a custom BIOS, which in turn will boost the 780 above the 290 and do it at lower power consumption/noise/heat.

    Also, to Crossfire the 290 you will need a min. of 1000W PSU, as we ran into issues with 800W PSUs during testing with Crossfire and heavily overclocked CPUs.
  • rtho782 - Wednesday, November 6, 2013 - link

    This card actually hits louder noise levels than the old FX 5800 Ultra! http://techreport.com/review/4966/nvidia-geforce-f... Not exactly the same method of measuring, but this was 10 years ago...
  • AnnihilatorX - Wednesday, November 6, 2013 - link

    Power consumption will be identical while performance will be down, so efficiency will be slipping and 290 will have all the same power/cooling requirements as 290X.


    The above statement, I feel, is an over-simplification.

    I would imagine the actual instantaneous clock-for-clock power consumption will actually decrease due to the reduced number of compute units. However, the 290 and 290X cannot sustain their boost clocks and are nearly always throttled by their thermal limits. Hence the practical power consumption is similar, and since at the same power draw the 290 would have to run a higher clock to match the speed of the 290X, its theoretical efficiency is somewhat lower, but I don't think they differ by much.
  • UGMan - Wednesday, November 6, 2013 - link

    Come on AMD, get this beauty out with coolers from ASUS, MSI, Sapphire et al. and then TAKE MY MONEY! Please!!!
    Nothing out there touches it for the price, and AMD have finally sorted out Crossfire. I've got a feeling that Mantle is going to shock and awe with its performance.

    Bring it on !!!
  • Vorl - Wednesday, November 6, 2013 - link

    For everyone using stupid sound comparisons like "a 747 taking off", here are some real comparisons.

    Whisper Quiet Library at 6' 30dB
    Normal conversation at 3' 60-65dB
    Telephone dial tone 80dB
    http://www.gcaudio.com/resources/howtos/loudness.h...

    I think the reviewer is biased, considering how big a deal they make of noise now, when in the past with noisy NVIDIA cards it was more like "meh, they are noisy, BUT FAST". Now they are all over how loud the card is.
