AMD's Gaming Evolved Application

During AMD’s “partner time” block at the 2014 GPU Product Showcase, one of the projects presented was the Raptr social networking and instant messaging application. The application is put together by the company of the same name, and AMD would be partnering with Raptr to produce an AMD-branded version of the utility called the “AMD Gaming Evolved App, Powered By Raptr”.

In a nutshell, the Gaming Evolved App (GEA) is AMD’s attempt to bring another value-add feature to the Radeon brand. And although AMD will never explicitly say this, the GEA is clearly intended to counter NVIDIA’s successful GeForce Experience utility, which exited beta back in May and has been continuing to add features since.

Raptr/GEA contains a wealth of functionality, the application being several years old at this point, but the key feature as a video card utility – and the reason AMD has picked it up – is its latest addition, the game optimization service. Just launched last month in beta, the optimization service is a functional clone of GeForce Experience’s optimization service. Designed with the same goals in mind, it is intended to give gamers disinterested in configuring their games – or even just looking for a place to start – a way to simply download a suitable collection of settings for their games and hardware, and to apply those settings to their games.

The concept is in practice very similar to the recommended settings that most games apply today, but driven by the GPU manufacturer instead of the game developer, and kept up to date with hardware/software changes as opposed to being set in stone when the game went gold. Even for someone like a professional GPU reviewer, it’s a very nifty thing to have when turning up every setting isn’t going to be practical.

To get right to the point then, while we’re big fans of the concept, it’s clear that this is a case of AMD tripping over themselves in reacting to something NVIDIA has done, looking for the fastest way of achieving the same thing. Like GeForce Experience, AMD has started bundling GEA with their drivers and installing it by default, but unlike GFE it’s still in beta at this point, and a very rough beta at that. And not to take an unnecessary shot at AMD, but even in beta GeForce Experience wasn’t this raw or this incomplete.

So why are we so down on GEA? There are a few reasons, the most basic of which is that the Raptr service lacks enough performance data for GEA to offer meaningful recommendations. Even on a fairly old card like a Radeon HD 7950, GEA was only able to find settings for 5 of the 11 games we have installed on our GPU testbed, failing to include settings for a number of games that are months (if not years) old. To be fair, every service has to start out somewhere, and GFE certainly didn’t launch with a massive library of games, but 5 games, none newer than March, is a particularly bad showing.

Now a lot of this has to do with how Raptr collects the performance data it uses for recommendations. NVIDIA for their part decided to do everything in-house, relying on their driver validation GPU farms to benchmark games across multiple settings and find a good balance based on parameters picked by the GFE development team. Raptr, though backed by AMD, has nothing resembling NVIDIA’s GPU farms and as such is going the crowdsourced route, relying on telemetry taken from Raptr users’ computers. Raptr’s data acquisition method is not necessarily wrong, but it means there’s no one to bootstrap the service with data, and as a result the service has started out with essentially nothing.
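
To illustrate the difference in approach, here is a minimal sketch of how a crowdsourced recommendation service along these lines could work: clients report the frame rates they see for a given game, GPU, and preset, and the service recommends the highest preset that still meets a playability target. Everything here – the games, GPUs, frame rates, FPS target, and aggregation logic – is our own assumption for illustration, not Raptr’s actual implementation.

```python
from collections import defaultdict

# Hypothetical telemetry records reported by clients: (game, gpu, preset, average fps).
# The games, GPUs, and numbers below are invented purely for illustration.
telemetry = [
    ("BioShock Infinite", "Radeon HD 7950", "Ultra", 48.0),
    ("BioShock Infinite", "Radeon HD 7950", "High", 62.0),
    ("BioShock Infinite", "Radeon HD 7950", "Medium", 85.0),
    ("BioShock Infinite", "Radeon R9 290", "Ultra", 74.0),
]

TARGET_FPS = 60.0  # assumed playability target
PRESET_ORDER = ["Ultra", "High", "Medium", "Low"]  # assumed quality ranking, best first


def recommend(game, gpu):
    """Return the highest preset whose crowd-reported average FPS meets the target,
    or None if no telemetry exists for this game/GPU pair (the bootstrapping problem)."""
    samples = defaultdict(list)
    for g, card, preset, fps in telemetry:
        if g == game and card == gpu:
            samples[preset].append(fps)
    # Average the reported samples per preset, then pick the best preset that still hits the target.
    averages = {preset: sum(v) / len(v) for preset, v in samples.items()}
    for preset in PRESET_ORDER:
        if preset in averages and averages[preset] >= TARGET_FPS:
            return preset
    return None


print(recommend("BioShock Infinite", "Radeon HD 7950"))  # -> "High"
print(recommend("Crysis 3", "Radeon HD 7950"))            # -> None: no data collected yet
```

The second lookup shows the bootstrapping problem in miniature: with no telemetry for a game, there is simply nothing to recommend, which is exactly the situation GEA finds itself in today.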

Raptr for their part is aware of the problem they’re faced with, and in time the distribution of GEA alongside their own Raptr application will hopefully ensure that there are enough users playing enough games to collect the necessary data. Even so, they did have to implement what amounts to a solution to the tragedy of the commons to make sure that data gets collected: users cannot receive settings from the Raptr service unless they provide data in return. Turning off the telemetry service will also turn off the client’s ability to pull down settings, full stop. Given the service’s requirements for data collection it’s likely the best solution to the problem, but regardless we have to point out that Raptr is alone in this requirement; NVIDIA can offer GFE without requiring performance telemetry from users.

Moving on then, the other showstopper with GEA’s current optimization service is that the UI has clearly been an afterthought. The GEA UI lists settings by the raw values used in a game’s settings file rather than by the names of those values – e.g. “Ultra” texture quality in BioShock Infinite is labeled as texture detail “4”, or worse. Without sufficient labeling it’s impossible to tell just what those settings mean, let alone what they may do. As such, applying GEA settings right now is something of a shot in the dark, as you don’t know what you’re going to get.
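
For readers unfamiliar with why this matters, here is a minimal sketch of the underlying issue: many games store their settings as raw enumerated values in an .ini-style file, and without a translation table those numbers are meaningless to the user. The key names and label mappings below are hypothetical, for illustration only – they are not taken from BioShock Infinite’s actual configuration files.

```python
# Hypothetical excerpt of a game's settings file; key names and values are invented for illustration.
raw_settings = {
    "TextureDetail": 4,
    "ShadowQuality": 2,
}

# The kind of lookup table the GEA UI is currently missing: it shows the raw number
# on the left where a human-readable label like the one on the right belongs.
LABELS = {
    "TextureDetail": {0: "Very Low", 1: "Low", 2: "Medium", 3: "High", 4: "Ultra"},
    "ShadowQuality": {0: "Low", 1: "Medium", 2: "High"},
}

for key, value in raw_settings.items():
    label = LABELS.get(key, {}).get(value, str(value))
    print(f"{key}: {label} (raw value {value})")
# TextureDetail: Ultra (raw value 4)
# ShadowQuality: High (raw value 2)
```

GEA currently surfaces only the raw numbers; a translation layer along these lines is what it needs before the recommendations it presents can be meaningfully evaluated by users.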

Finally, presumably as a holdover from the fact that Raptr is free, GEA runs what can only be described as ads. These aren’t straight-up advertisements, instead directing users towards other services Raptr/GEA provides, such as free-to-play games and a rewards service. But the end game is the same, as these services are paid for by Raptr’s sponsors and are intended to drive users towards purchasing games and merchandise from those sponsors. Far be it from us to look down upon advertisements – after all, AnandTech is ad supported – but there’s something to be said about shipping ad-supported applications in a driver download. We’re at something of a loss to explain why AMD doesn’t just foot the complete bill on their customized version of the Raptr client and have the ads removed entirely.

At any rate, we do have some faith that in time these issues can be dealt with and GEA can essentially be fixed, but right now GEA is far too raw for distribution. It needs to go back into development for another few months or so (and the service needs to be bootstrapped with many more computer configurations and games) before it’s going to be of suitable quality for inclusion in AMD’s drivers. Otherwise AMD is doing their users a disservice by distributing inferior, ad-supported software alongside the software required to use their products.

The Test

For the launch of the Radeon R9 290, the press drivers and the launch drivers will be AMD’s recently released Catalyst 13.11 Beta v8 drivers. Along with containing support for the 290 and the 47% fan speed override, the only other changes in these drivers involve Batman: Arkham Origins and Battlefield 4, games which we aren’t using for this review. So the results will be consistent with past drivers. Meanwhile for NVIDIA’s cards we’re continuing to use their release 331.58 drivers.

CPU: Intel Core i7-4960X @ 4.2GHz
Motherboard: ASRock Fatal1ty X79 Professional
Power Supply: Corsair AX1200i
Hard Disk: Samsung SSD 840 EVO (750GB)
Memory: G.Skill RipjawZ DDR3-1866 4 x 8GB (9-10-9-26)
Case: NZXT Phantom 630 Windowed Edition
Monitor: Asus PQ321
Video Cards: AMD Radeon R9 290X
AMD Radeon R9 290
XFX Radeon R9 280X Double Dissipation
AMD Radeon HD 7970 GHz Edition
AMD Radeon HD 7970
AMD Radeon HD 6970
AMD Radeon HD 5870
NVIDIA GeForce GTX Titan
NVIDIA GeForce GTX 780
NVIDIA GeForce GTX 770
Video Drivers: NVIDIA Release 331.58
AMD Catalyst 13.11 Beta v1
AMD Catalyst 13.11 Beta v5
AMD Catalyst 13.11 Beta v8
OS: Windows 8.1 Pro

 

Comments

  • just4U - Wednesday, November 6, 2013 - link

    You have to ask yourself: is Ryan biased towards Nvidia or AMD... or maybe it's simply his tolerance for noise that is the issue.

    Anyway.. people buying these cards will have some options. For me the 95C is a no go as is the noise. Something I'd tolerate until a good aftermarket solution could be implemented. AMD and Nvidia (until their titan reference cooler) have always been a little meh.. with reference coolers. We all know this..

    My last two cards have been AMD ones, and if I was in the market for a card today I'd go straight for the Nvidia 780. Not because of its speeds, certainly not because of its drivers, and not because I am a fan. I simply like their kickass reference cooler and games bundle.

    I'm not in the market though lol. Quite happy with my Radeon 7870.. and not looking to upgrade yet.
  • jbs181818 - Thursday, November 7, 2013 - link

    With all that power consumption, what size PSU is required? Assuming 1 GPU, a Haswell CPU, and 1 SSD.
  • dwade123 - Thursday, November 7, 2013 - link

    The 290X doesn't make sense when the cheaper 290 performs almost identically. And neither can max out Crysis 3. Gamers are better off waiting for real next-gen cards like Maxwell, and with next-gen console ports coming in 2014 it is common sense to do so.
  • polaco - Tuesday, November 12, 2013 - link

    "neither can max out Crysis 3" what the hell are you talking about?
    52 fps at 2560x1440 HQ + FXAA
    77 fps at 1920x1080 HQ + FXAA
    with that line of thinking then nor 780 or Titan are worthy since fps diff is minimal

    "gamers are better off waiting for real next-gen cards like Maxwell"
    well, 290 and 290X are AMD true next gen cards, maybe you feel fooled by having bought a 780 for almost 700 bucks and then you feel like Maxwell will relief that pain, or maybe you work for NVidia marketing deparment... for the time NVidia came out with it AMD will be pushing their next gen too, will you recommend waiting then too? so we wait forever then uh?
    "and with next-gen console ports coming in 2014 suggests it is common sense to do so"
    you mean to wait for NVidia card to run games that will be optimized to AMD hardware that is inside every next gen console?
    please go to see a doctor....
  • TempAccount007 - Saturday, November 9, 2013 - link

    Who the hell uses a reference cooler on any AMD card? The only people that buy reference cards are those who are going to water cool them.
  • NA1NSXR - Monday, November 11, 2013 - link

    If I was in the market for a card I'd wait until the aftermarket cooler designs come out. Should make the noise and temp situation a little more bearable. Still, the proprietary nVidia value-adds like HBAO+, adaptive vsync, TXAA, etc. are hard to give up for me. It is a hard call. If the 780 was only $50 more than the 290 I'd take the 780, but since the difference is $100... I don't know. Really tough call.
  • beck2448 - Tuesday, November 12, 2013 - link

    Too noisy and hot.
  • devilskreed - Tuesday, November 12, 2013 - link

    Hail High AMD.. the gamers' saviour!!!
    Hail High AMD.. the price/performance king
    Hail High AMD.. the people's choice..

    Healthy competition from AMD's side. I stopped buying nvidia after the 8800GT :p purely due to the price/performance benefits that AMD offers..
  • bloodbones - Thursday, November 14, 2013 - link

    The battle between AMD (ex ATI) and nvidia has been around since I was 18 years old, and I am 30 now. Over the years I have tried a huge number of video cards from both companies, and the only conclusion is that things have always been the same; nothing changes over the years: more or less the same performance, and:
    Nvidia = more expensive cards but better quality cards, lower noise levels, lower temps
    Ati/Amd = cheaper cards with higher noise levels, higher temps
    Period.
  • horse07 - Thursday, November 14, 2013 - link

    Guys, when will you update the 2013 GPU benchmarks with the recent R7/R9 and 700 series?
