AMD's Gaming Evolved Application

During AMD’s “partner time” block at the 2014 GPU Product Showcase, one of the projects presented was the Raptr social networking and instant messaging application. AMD is partnering with Raptr, the company of the same name behind the application, to produce an AMD-branded version of the utility called the “AMD Gaming Evolved App, Powered By Raptr”.

In a nutshell, the Gaming Evolved App (GEA) is AMD’s attempt to bring another value-add feature to the Radeon brand. And although AMD will never explicitly say so, the GEA is clearly intended to counter NVIDIA’s successful GeForce Experience utility, which exited beta back in May and has been steadily adding features since.

Raptr/GEA contains a wealth of functionality, the application being several years old at this point, but the key feature as a video card utility – and the reason AMD has picked it up – is its latest addition, the game optimization service. Just launched last month in beta, the optimization service is a functional clone of GeForce Experience’s optimization service. Designed with the same goals in mind, it is intended to offer gamers disinterested in configuring their games – or even just looking for a place to start – a way to simply download a suitable collection of settings for their games and hardware, and apply those settings with a click.

The concept is in practice very similar to the recommended settings that most games apply today, but driven by the GPU manufacturer instead of the game developer, and kept up to date with hardware/software changes as opposed to being set in stone when the game went gold. Even for someone like a professional GPU reviewer, it’s a very nifty thing to have when turning up every setting isn’t going to be practical.

To get right to the point then, while we’re big fans of the concept it’s clear that this is a case of AMD tripping over themselves in trying to react to something NVIDIA has done, by trying to find the fastest way of achieving the same thing. Like GeForce Experience, AMD has started bundling GEA with their drivers and installing it by default, but unlike GFE it’s still in beta at this point, and a very rough beta at that. And not to take an unnecessary shot at AMD, but even in beta GeForce Experience wasn’t this raw or this incomplete.

So why are we so down on GEA? There are a few reasons, but the most basic is that the Raptr service lacks enough performance data for GEA to offer meaningful recommendations. Even on a fairly old card like a Radeon HD 7950, GEA was only able to find settings for 5 of the 11 games we have installed on our GPU testbed, failing to include settings for a number of games that are months (if not years) old. To be fair, every service has to start out somewhere, and GFE certainly didn’t launch with a massive library of games, but 5 games, none newer than March, is a particularly bad showing.

Now a lot of this has to do with how Raptr collects the performance data it uses for recommendations. NVIDIA for their part decided to do everything in house, relying on their driver validation GPU farms to benchmark games across multiple settings to find a good balance based on parameters picked by the GFE development team. Raptr, though backed by AMD, does not have anything resembling NVIDIA’s GPU farms and as such is going the crowdsourced route, relying on telemetry taken from Raptr users’ computers. Raptr’s data acquisition method is not necessarily wrong, but it means there’s no one to bootstrap the service with data, which means the service has started out with essentially nothing.
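The crowdsourced approach can be sketched in miniature. The snippet below is a hypothetical illustration only – the preset numbering, frame rate target, and telemetry samples are all invented, not Raptr’s actual implementation – but it captures the basic idea: aggregate users’ reported frame rates per (GPU, game, preset) and recommend the highest preset that still clears a playability target. It also makes the bootstrapping problem concrete: with no samples for a game, there is simply nothing to recommend.

```python
# Hypothetical sketch of a crowdsourced settings recommender in the vein of
# Raptr's service. Presets are integers (higher = prettier); all names and
# numbers below are illustrative assumptions, not real Raptr data.
from collections import defaultdict

TARGET_FPS = 40  # assumed playability threshold


def recommend(telemetry, gpu, game):
    """telemetry: list of (gpu, game, preset, avg_fps) samples from users."""
    fps_by_preset = defaultdict(list)
    for t_gpu, t_game, preset, fps in telemetry:
        if t_gpu == gpu and t_game == game:
            fps_by_preset[preset].append(fps)

    best = None
    for preset, samples in fps_by_preset.items():
        avg = sum(samples) / len(samples)
        # Keep the highest preset whose average reported FPS meets the target
        if avg >= TARGET_FPS and (best is None or preset > best):
            best = preset
    return best  # None means the service has no data yet for this combo


samples = [
    ("HD 7950", "BioShock Infinite", 2, 75.0),
    ("HD 7950", "BioShock Infinite", 3, 52.0),
    ("HD 7950", "BioShock Infinite", 4, 33.0),
]
print(recommend(samples, "HD 7950", "BioShock Infinite"))  # -> 3
print(recommend(samples, "HD 7950", "Crysis 3"))           # -> None
```

The empty case is exactly where GEA finds itself today: until enough users have played a given game on a given card, the service has nothing to hand back.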

Raptr for their part is aware of the problem they’re faced with, and in time the distribution of GEA alongside their own Raptr application will hopefully ensure that there are enough users playing enough games out there to collect the necessary data. Even so, they did have to implement what amounts to a solution to the tragedy of the commons problem to make sure that data gets collected: users cannot receive settings from the Raptr service unless they provide data in return. Turning off the telemetry service will also turn off the client’s ability to pull down settings, full stop. Given the service’s requirements for data collection it’s likely the best solution to the problem, but regardless we have to point out that Raptr is alone in this requirement. NVIDIA can offer GFE without requiring performance telemetry from users.

Moving on then, the other showstopper with GEA’s current optimization service is that the UI has obviously been an afterthought. The GEA UI lists settings by the raw values used in a game’s settings file rather than by the names those values correspond to: “Ultra” texture quality in BioShock Infinite, for example, is labeled as texture detail “4”. Without proper labeling it’s impossible to tell what those settings mean, let alone what they do, so applying GEA settings right now is something of a shot in the dark – you don’t know what you’re going to get.
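The missing piece is essentially a lookup table from raw config-file values to human-readable names. The mapping below is hypothetical – beyond “4” meaning Ultra, we don’t know BioShock Infinite’s actual scale – but it shows the kind of labeling layer the GEA UI currently lacks:

```python
# Hypothetical label table for one game setting. Only the 4 -> "Ultra"
# mapping is known from the article; the rest is an assumed example scale.
TEXTURE_DETAIL_LABELS = {1: "Low", 2: "Medium", 3: "High", 4: "Ultra"}


def describe(setting_name, raw_value):
    """Turn a raw settings-file value into something a user can read."""
    label = TEXTURE_DETAIL_LABELS.get(raw_value)
    if label is None:
        # This is what GEA effectively shows today: a bare number
        return f"{setting_name} = {raw_value} (unknown)"
    return f"{setting_name}: {label} ({raw_value})"


print(describe("Texture Detail", 4))  # Texture Detail: Ultra (4)
```

Maintaining such tables per game is tedious, which is presumably why it hasn’t been done yet, but without them the recommendations are opaque.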

Finally, presumably as a holdover from the fact that the Raptr client is free, GEA runs what can only be described as ads. These aren’t straight-up third-party advertisements; rather they direct users toward other services Raptr/GEA provides, such as free-to-play games and a rewards service. But the end goal is the same, as these services are paid for by Raptr’s sponsors and are intended to drive users toward purchasing games and merchandise from those sponsors. Far be it from us to look down on advertisements – after all, AnandTech is ad supported – but an ad-supported application bundled into a driver download is another matter. We’re at something of a loss to explain why AMD doesn’t simply foot the complete bill for their customized version of the Raptr client and have the ads removed entirely.

At any rate we do have some faith that in time these issues can be dealt with and the GEA can essentially be fixed, but right now the GEA is far too raw for distribution. It needs to go back into development for another few months or so (and the service bootstrapped with many more computer configurations and games) before it’s going to be of suitable quality for inclusion in AMD’s drivers. Otherwise AMD is doing their users a disservice by distributing inferior, ad supported software alongside the software required to use their products.

The Test

For the launch of the Radeon R9 290, the press drivers and the launch drivers will be AMD’s recently released Catalyst 13.11 Beta v8 drivers. Along with containing support for the 290 and the 47% fan speed override, the only other changes in these drivers involve Batman: Arkham Origins and Battlefield 4, games which we aren’t using for this review. So the results will be consistent with past drivers. Meanwhile for NVIDIA’s cards we’re continuing to use their release 331.58 drivers.

CPU: Intel Core i7-4960X @ 4.2GHz
Motherboard: ASRock Fatal1ty X79 Professional
Power Supply: Corsair AX1200i
Hard Disk: Samsung SSD 840 EVO (750GB)
Memory: G.Skill RipjawZ DDR3-1866 4 x 8GB (9-10-9-26)
Case: NZXT Phantom 630 Windowed Edition
Monitor: Asus PQ321
Video Cards: AMD Radeon R9 290X
AMD Radeon R9 290
XFX Radeon R9 280X Double Dissipation
AMD Radeon HD 7970 GHz Edition
AMD Radeon HD 7970
AMD Radeon HD 6970
AMD Radeon HD 5870
NVIDIA GeForce GTX Titan
NVIDIA GeForce GTX 780
NVIDIA GeForce GTX 770
Video Drivers: NVIDIA Release 331.58
AMD Catalyst 13.11 Beta v1
AMD Catalyst 13.11 Beta v5
AMD Catalyst 13.11 Beta v8
OS: Windows 8.1 Pro

 

295 Comments

  • swing848 - Tuesday, November 5, 2013 - link

    It will only get loud for me when playing games or the occasional benchmark. During games I wear a headset, and during benchmarks I can leave the room. I have a room dedicated to computer use and the house has good sound proofing, so it will not bother other people.

    If I want it quiet I will use a water cooler with a large radiator and fan; my Cooler Master HAF 922 case already has sealed holes for tubing for an external radiator.

    Water cooling is better than dumping all the hot air from the video card into my case, even if it is well cooled with 200mm fans. I overclock my CPU and I do not want it, RAM, or chips on the motherboard to get any hotter than necessary.

    The only thing I will miss on this card are the Black Diamond Chokes and Digital Power 8+2+2 phase used on the Sapphire R9 280X Toxic [Black Diamond Chokes are also used on the Sapphire R9 280 Vapor-X]. To be honest I do not know how many mosfets are dedicated to cool GPU functions on the R9 290. In any event, both the Toxic and Vapor-X dump hot air into the case.

    Another thing I would like to have seen on the Sapphire R9 290 is a metal back plate.
  • somethingwicked - Tuesday, November 5, 2013 - link

    what the holy gee whiz

    the new AMD drivers are insane! 290x is speeding past Titan now and 780 is a turtle while 290x is a ferrari... the new 290 is performing like the 290x was at launch and now the 290x is a card unto its self at the top of the food chain

    thank goodness for competition

    i smell more deep nvidia price cuts cause AMD is kicking butt
  • TheJian - Tuesday, November 5, 2013 - link

    rofl. So NV just has to release a "fan that drives you out of the room" driver now to respond. IF NV did this all of you would be falling all over yourselves to moan and groan claiming NV was cheating. AMD does it, and wow this is awesome, I love the noise anyway...LOL. Technically this is all NV has to do though right? Raise the fan speed until it hits another 10DB's and blow them down again.

    Perf is great, but not if it drives me out of my room. There is nothing stopping NV from adding 10DB's to their cards and calling it a day. But I don't want this being called normal. IMHO this is a crap way to get perf and a game both sides can play. If NV does this tomorrow and says we're hiking prices because if we overclock our cards also (which is essentially what they're doing here, just reverse, raise fan so clocks boost higher, same story) we blow AMD away.
    https://www.youtube.com/watch?v=djvZaHHU4I8
    Both cards clocked to max (290x vx. 780, NOT TI mind you).
    Fast forward to 8:40 for benchmarks...AMD is blown away. The 780 didn't lose ANY game. Not one. And in star citizen blows AMD away. I don't believe these cards will be used mostly on 1600P or 1440p either. 1080/1200p is running on 98.5% of our screens and a large portion of the 1.5% that is above these two resolutions are running TWO or more cards. Steam's surveys don't lie. Already they turn down nearly every game at 1440p here which to me means you won't run there anyway (they are not reporting mins here for most games). Everything is pretty maxed in linustechtips vid above at 1080p, and you should be able to stay above 30fps (probably?) doing it. They are reporting avg's here at anandtech and already turning stuff down. Meaning maxing graphics on a lot of games would be unplayable under 30fps especially when lots of crap is going on. LOW DETAIL? Seriously? So Ryan is assuming you'll buy one of these cards (or any single gpu card) to then go out and buy a 1440p monitor (which are still over $550 for any brand you'd recognize the name of on newegg, and start there when you choose NEWEGG ONLY) and then run the details on low to play games? I don't think so. If he's assuming we're all going to buy new monitors, might as well get Gsync instead (though I'd say wait for more models first even if Asus has a decent one out of the gate). Heck some of the games have details DOWN on 1080p here (total war2, medium shadows? still looks like it would hit below 30fps on most cards).

    For anyone saying get a water block...LOL. How much did my card cost if I have to add that and how many regular users even know what water is or are even capable of adding one? I say that as a guy who has as Koolance kit. Or even adding an aftermarket fan. Isn't this upping the cost of the card then?

    Both solutions are unacceptable and attempting to fix a problem caused by shipping a card that already is unacceptably NOISY, hot and sucking up watts vs. it's competition. No games either.

    Add on top reviews elsewhere show other games that give an opposite story. Techpowerup, Techspot, Guru3d, Hardocp show wins for AC3, COD Blackops2, Diablo3 (spanked by NV), FarCry3, SplinterCell Blacklist, Star Craft2 (spanked, heck 770 does well), World of Warcraft (spanked), Skyrim (lost 3 resolutions, 290/x won 1600p, oddly lost upping to 5760), Resident Evil 6.

    So maybe you need to take a bit of a WIDER view than anandtech ;) I don't call 53 more watts than 780 (or 70 more than 770) a victory. Never mind the noise it creates while doing it. Is running 10DB's higher really a better card? Do people here realize that noise in DB's is EXPONENTIAL? A 10Db noise difference is HUGE (ryan did say 2x as loud). 12 degress hotter for this kind of perf isn't good either. I see a clear reason NV should be charging more than AMD's cards. They are better. I can OC and beat them easily without all the noise, heat, watts. I don't need a waterblock to do this either...ROFL. Whatever I already have on my card can do this easily and come in UNDER Ryan's 7970 noise levels that he calls acceptable.

    I see no price cuts, but probably a few more videos poking fun at AMD like this:
    https://www.youtube.com/watch?v=oV5vs27wnCA
    Tell me that isn't funny ;) I'm wondering if someone at NV paid for this vid to be made...LOL. Sparks, dripping fire, ROFL. Great job of getting the message across.
  • rviswas11 - Tuesday, November 5, 2013 - link

    i couldn't give two s***ts because when i game i wear noise cancelling headphones that i use for listening to music on the bus
  • rviswas11 - Tuesday, November 5, 2013 - link

    so i'm happy. and with the exception of skyrim i could set the graphics card to 10% fan profile and max the game. i run quite afew mods for skyrim don't play it anymore.
    but i can see how the noise can be a major issue for a lot of people.
  • Morawka - Tuesday, November 5, 2013 - link

    you're not understanding. At this level of noise, not even noise cancelling headphones will drown it out. This is louder than a 747 jet from 100 meters away.
  • mgl888 - Tuesday, November 5, 2013 - link

    You'd better be trolling.
    60dB is the intensity of a normal conversation.
  • Galidou - Tuesday, November 5, 2013 - link

    Comon, a 747 jet 100 meters away... you gotta be freaking stupid to only think it could be true :O
  • ahlan - Saturday, November 9, 2013 - link

    @Galidou
    What do expect from nvidia fantards who think higher price is better.
    They are so unsecure and delusional that they comment in every AMD review...

    Nvidia is loving their stupidness...
  • 1Angelreloaded - Saturday, November 16, 2013 - link

    120 decibels is a 747 I would know I work around them, But AMDs new cards are crap, 550$ to add a water cooling system totaling around 200-300$.......either way its not a deal really, Nvidia cards have more headroom on air and water, while AMD's cards are impressive to an extent Hawaii is a bastard step child of sorts, it is probably a failed version of the next chip to compete with Maxwell but had no choice but to release it.
