AMD's Gaming Evolved Application

During AMD’s “partner time” block at the 2014 GPU Product Showcase, one of the projects presented was the Raptr social networking and instant messaging application. Developed by the company of the same name, the application is the basis of a partnership between AMD and Raptr to produce an AMD-branded version of the utility, the “AMD Gaming Evolved App, Powered By Raptr.”

In a nutshell, the Gaming Evolved App (GEA) is AMD’s attempt to bring another value-add feature to the Radeon brand. And although AMD will never explicitly say so, the GEA is clearly intended to counter NVIDIA’s successful GeForce Experience utility, which exited beta back in May and has continued to add features since.

Raptr/GEA contains a wealth of functionality, the application being several years old at this point, but the key feature as a video card utility – and the reason AMD has picked it up – is its latest addition, the game optimization service. Just launched last month in beta, the optimization service is a functional clone of GeForce Experience’s optimization service. Designed with the same goals in mind, it gives gamers who have no interest in configuring their games – or who are just looking for a place to start – a way to simply download a suitable collection of settings for their games and hardware and apply those settings automatically.

The concept is in practice very similar to the recommended settings that most games apply today, but driven by the GPU manufacturer instead of the game developer, and kept up to date with hardware/software changes as opposed to being set in stone when the game went gold. Even for someone like a professional GPU reviewer, it’s a very nifty thing to have when turning up every setting isn’t going to be practical.

To get right to the point then, while we’re big fans of the concept, it’s clear that this is a case of AMD tripping over themselves in reacting to something NVIDIA has done by looking for the fastest way to achieve the same thing. Like GeForce Experience, AMD has started bundling GEA with their drivers and installing it by default, but unlike GFE it’s still in beta at this point, and a very rough beta at that. And not to take an unnecessary shot at AMD, but even in beta GeForce Experience wasn’t this raw or this incomplete.

So why are we so down on GEA? There are a few reasons, the most basic being that the Raptr service lacks enough performance data for GEA to offer meaningful recommendations. Even on a fairly old card like the Radeon HD 7950, GEA was only able to find settings for 5 of the 11 games installed on our GPU testbed, failing to include settings for a number of games that are months (if not years) old. To be fair, every service has to start somewhere, and GFE certainly didn’t launch with a massive library of games, but 5 games, none newer than March, is a particularly bad showing.

Now a lot of this has to do with how Raptr collects the performance data it uses for recommendations. NVIDIA for their part decided to do everything in-house, relying on their driver validation GPU farms to benchmark games across multiple settings and find a good balance based on parameters picked by the GFE development team. Raptr, though backed by AMD, has nothing resembling NVIDIA’s GPU farms, and as such is going the crowdsourced route, relying on telemetry taken from Raptr users’ computers. Raptr’s data acquisition method is not necessarily wrong, but it means there’s no one to bootstrap the service with data, and so the service has started out with essentially nothing.
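To illustrate the crowdsourced approach in the abstract, below is a minimal Python sketch of how settings recommendations could be derived from user telemetry. The data structures, the 60 fps target, the preset names, and the minimum sample count are our own assumptions for illustration, not a description of Raptr's actual implementation.

    from collections import defaultdict
    from statistics import mean

    # Hypothetical preset ladder, lowest to highest quality (our assumption).
    PRESETS = ["Low", "Medium", "High", "Ultra"]

    def recommend(reports, gpu, game, target_fps=60.0, min_samples=5):
        """Pick the highest-quality preset whose crowdsourced average frame rate
        meets the target. `reports` is an iterable of (gpu, game, preset, fps)
        tuples gathered from user telemetry."""
        samples = defaultdict(list)
        for r_gpu, r_game, preset, fps in reports:
            if r_gpu == gpu and r_game == game:
                samples[preset].append(fps)

        # Walk from highest quality down; return the first preset with enough
        # data that clears the frame rate target.
        for preset in reversed(PRESETS):
            fps_list = samples.get(preset, [])
            if len(fps_list) >= min_samples and mean(fps_list) >= target_fps:
                return preset
        return None  # Not enough telemetry yet -- the "no settings found" case.

    # Example with made-up telemetry for a card/game pair:
    reports = [("Radeon HD 7950", "BioShock Infinite", "Ultra", 72.0)] * 6
    print(recommend(reports, "Radeon HD 7950", "BioShock Infinite"))  # -> Ultra

The key implication of this model is visible in the fallback: without enough samples for a given card and game, the service has nothing to return, which is exactly the situation GEA finds itself in today.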

Raptr for their part is aware of the problem they’re faced with, and in time the distribution of GEA alongside their own Raptr application will hopefully ensure that there are enough users playing enough games to collect the necessary data. Even so, they did have to implement what amounts to a solution to the tragedy of the commons to make sure that data gets collected: users cannot receive settings from the Raptr service unless they provide data in return. Turning off the telemetry service will also turn off the client’s ability to pull down settings, full stop. Given the service’s dependence on data collection it’s likely the best solution to the problem, but regardless we have to point out that Raptr is alone in this requirement; NVIDIA can offer GFE without requiring performance telemetry from users.

Moving on, the other showstopper with GEA’s current optimization service is that the UI has obviously been an afterthought. The GEA UI lists settings by the values used in a game’s settings file rather than by the name of each value; e.g. “Ultra” texture quality in BioShock Infinite is labeled as texture detail “4”, or worse. Without sufficient labeling it’s impossible to tell just what those settings mean, let alone what they may do. As such, applying GEA settings right now is something of a shot in the dark, as you don’t know what you’re going to get.
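As a rough illustration of what a friendlier UI would need to do, the sketch below translates raw settings-file values into human-readable labels before presenting them. Only the “4” → “Ultra” mapping for BioShock Infinite’s texture detail comes from the example above; the other entries, key names, and the function itself are purely hypothetical.

    # Hypothetical lookup table translating raw settings-file values into the
    # labels a game's own menu would show. Only the "4" -> "Ultra" entry for
    # BioShock Infinite texture detail is taken from the example above.
    SETTING_LABELS = {
        ("BioShock Infinite", "TextureDetail"): {
            "1": "Low", "2": "Medium", "3": "High", "4": "Ultra",
        },
    }

    def describe_setting(game, key, raw_value):
        """Return a human-readable label for a raw config value, falling back
        to the raw value itself when no mapping is known -- which is effectively
        what the current GEA UI shows for everything."""
        labels = SETTING_LABELS.get((game, key), {})
        return labels.get(str(raw_value), str(raw_value))

    print(describe_setting("BioShock Infinite", "TextureDetail", 4))  # -> Ultra
    print(describe_setting("BioShock Infinite", "ShadowQuality", 2))  # -> 2 (unmapped)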

Finally, presumably as a holdover from the fact that Raptr is free, GEA runs what can only be described as ads. These aren’t straight-up advertisements, instead directing users towards other services Raptr/GEA provides, such as free-to-play games and a rewards service. But the end result is the same, as these services are paid for by Raptr’s sponsors and are intended to drive users towards purchasing games and merchandise from those sponsors. Far be it from us to look down on advertisements – after all, AnandTech is ad supported – but there’s something to be said about shipping ad-supported applications in a driver download. We’re at something of a loss to explain why AMD doesn’t just foot the complete bill on their customized version of the Raptr client and have the ads removed entirely.

At any rate, we do have some faith that in time these issues can be dealt with and GEA can essentially be fixed, but right now GEA is far too raw for distribution. It needs to go back into development for another few months or so (and the service bootstrapped with many more computer configurations and games) before it’s of suitable quality for inclusion in AMD’s drivers. Otherwise AMD is doing their users a disservice by distributing inferior, ad-supported software alongside the software required to use their products.

The Test

For the launch of the Radeon R9 290, the press drivers and the launch drivers are AMD’s recently released Catalyst 13.11 Beta v8 drivers. Aside from adding support for the 290 and the 47% fan speed override, the only other changes in these drivers involve Batman: Arkham Origins and Battlefield 4, games which we aren’t using for this review, so the results will be consistent with past drivers. Meanwhile for NVIDIA’s cards we’re continuing to use their release 331.58 drivers.

CPU: Intel Core i7-4960X @ 4.2GHz
Motherboard: ASRock Fatal1ty X79 Professional
Power Supply: Corsair AX1200i
Hard Disk: Samsung SSD 840 EVO (750GB)
Memory: G.Skill RipjawZ DDR3-1866 4 x 8GB (9-10-9-26)
Case: NZXT Phantom 630 Windowed Edition
Monitor: Asus PQ321
Video Cards: AMD Radeon R9 290X
AMD Radeon R9 290
XFX Radeon R9 280X Double Dissipation
AMD Radeon HD 7970 GHz Edition
AMD Radeon HD 7970
AMD Radeon HD 6970
AMD Radeon HD 5870
NVIDIA GeForce GTX Titan
NVIDIA GeForce GTX 780
NVIDIA GeForce GTX 770
Video Drivers: NVIDIA Release 331.58
AMD Catalyst 13.11 Beta v1
AMD Catalyst 13.11 Beta v5
AMD Catalyst 13.11 Beta v8
OS: Windows 8.1 Pro


Comments

  • YazX_ - Tuesday, November 5, 2013

    This is the case for reference designs; I wouldn't expect custom designs to suffer from noise and heat issues. As an example, the reference GTX 770's temp under load is 80C, while I have a Gigabyte GTX 770 OC WindForce 3 and have never seen the GPU temp reach 65.

    The 290 for now is the best bang for the buck. Great job AMD, and for us it means another price cut from NVIDIA, which is the best part of this competition.
  • yeeeeman - Tuesday, November 5, 2013

    I think there is too much criticism on the subject of noise. You get a water block and the problem is solved, and you still pay less for that combo than you would have paid for a GTX 780.
    But the thing that really stands out from this review is the unused potential of the 290X. Just imagine how it would run unconstrained by its cooling system. I think it could hold its own against the coming 780 Ti from NVIDIA.
    And we should encourage AMD, because if it weren't for them, NVIDIA would never have dropped their prices. Now they are even releasing the full GK110 core at a lower price than Titan.
  • TheJian - Tuesday, November 5, 2013

    So how many people do you think are running water? And how much does that add to the cost of my shiny new hot card? Newegg isn't likely to be shipping water cooled cards by the millions...LOL. You are aware this is a reference NV card being tested too, right?

    Tom's Hardware seems to think noise, heat, and how it runs IN GAMES after a period of time are an issue:
    http://www.tomshardware.com/reviews/radeon-r9-290-...
    "On the R9 290X we received from AMD, and in the seven games we tested, a 40% fan speed is good enough to average about 874 MHz. But when you’re actually gaming on a hot card (and not just benchmarking a cold one), our two-minute Metro: Last Light test suggests you’ll be spending more of your time in the upper-700 MHz range. In fact, in some titles, you’ll dip under 1000 MHz before even getting out of the menu system and into the action (Arma and BioShock).

    You could call that questionable marketing. After all, the only way you’ll actually see a sustained 1000 MHz is if you either let the R9 290X’s fan howl like a tomcat looking for action or play platform-bound games."

    How fast will these be after running a few hours in game? Or even 1 hour? Most won't purchase water cooling for any GPU, so they'll be dealing with it as it is, or with a better fan at some point (assuming OEMs get them soon):
    "AMD’s scheme undoubtedly suffers a lack of clarity, and after piling praise onto the R9 290X’s value story, I now have to hope that Nvidia doesn’t follow AMD down this muddy little rabbit hole."

    I hope they don't follow AMD and release a driver tomorrow doing the same crap too. But if I was them, that is EXACTLY what I'd do in response, along with videos everywhere explaining this should not be done: "but we have to do the same crap the other guys pull to beat us, and when pulling this crap, you can clearly see there is no reason for us to drop prices again" :) Or something like that...LOL. I really think AMD is going to be hurt by people buying the first rev and complaining all over forums about their silent (or nearly noise free) PC sounding like a jet engine now. Also, as Tom's points out, are people really going to get the full perf while playing for longer than the COLD periods that reviews bench under? A lot of people game for HOURS, and they noted it slowing down even in menus before playing the actual game. How does this card affect the rest of the temps in your PC during hours of playing? How does a GPU temp of 94 affect the PC vs. 81 for NV? I'll take 81, thanks.
  • 1H4X4S3X - Tuesday, November 5, 2013

    Does anyone remember Ryan Smith's review of the even louder GTX 480?
    Anyone who can't see the bias is blind himself.
  • Drunktroop - Tuesday, November 5, 2013

    You treat FurMark as a more meaningful/real-life test than Crysis 3?
  • silverblue - Tuesday, November 5, 2013

    Shh.

    The simple fact is, whilst incredibly fast, this baby needs modding with a better cooling solution. Early adopters are usually hit the hardest, so I'd recommend waiting until customised cards hit the shelves. A GTX 780-style cooler would be very interesting indeed.
  • Ryan Smith - Tuesday, November 5, 2013

    Which is actually why we threw the GTX 480 into our charts for this one, to offer some perspective. The GTX 480 is quieter than the 290.
  • chizow - Tuesday, November 5, 2013

    And certainly you must account for the possibility that expectations and tolerances change over time, right? Read the 580 review and you will see he says it is basically "Fermi Done Right," setting the new standard for high-end temps and cooling. The 580 design put a ring on the fan to reduce its whine, so going back to something as shrill and annoying is easily understood as a step in the wrong direction.
  • chizow - Tuesday, November 5, 2013

    To add to that, there was a time when 60mm and 80mm high-RPM fans were the norm on CPU coolers. No longer; I could never stand one of those high-pitched coolers again now that we are spoiled with 120mm/140mm CPU fans or multiple fans on radiators.
  • Drunktroop - Tuesday, November 5, 2013

    I think you should be ready for noise when the maker is not willing to disclose power figures.

    Too bad Hawaii for SFF gaming is a no-go.
    From a performance perspective it is unbeatable.
