AMD's Gaming Evolved Application

During AMD’s “partner time” block at the 2014 GPU Product Showcase, one of the projects presented was the Raptr social networking and instant messaging application. Put together by the company of the same name, the application is the basis of a partnership under which AMD will produce an AMD-branded version of the utility called the “AMD Gaming Evolved App, Powered By Raptr”.

In a nutshell, the Gaming Evolved App (GEA) is AMD’s attempt to bring another value-add feature to the Radeon brand. And although AMD will never explicitly say this, the GEA is clearly intended to counter NVIDIA’s successful GeForce Experience utility, which exited beta back in May and has continued to add features since.

Raptr/GEA contains a wealth of functionality, the application being several years old at this point, but the key feature as a video card utility – and the reason AMD has picked it up – is its latest addition, the game optimization service. Launched just last month in beta, the optimization service is a functional clone of GeForce Experience’s optimization service. Designed with the same goals in mind, it is intended to give gamers who are uninterested in configuring their games – or who are just looking for a place to start – a way to simply download a suitable collection of settings for their games and hardware, and apply those settings to their games.

The concept is in practice very similar to the recommended settings that most games apply today, but driven by the GPU manufacturer instead of the game developer, and kept up to date with hardware and software changes as opposed to being set in stone when the game goes gold. Even for someone like a professional GPU reviewer, it’s a very nifty thing to have when turning up every setting isn’t going to be practical.
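To make the concept concrete, the sketch below shows what “download and apply” amounts to in practice: merging a settings profile into a game’s INI-style configuration file. This is our own minimal illustration, assuming a hypothetical profile, file path, and section name; it is not Raptr’s or AMD’s actual client code.

```python
# A minimal sketch of applying a downloaded settings profile to a game's
# INI-style config file. The profile values, file path, and section name
# are hypothetical; this is not Raptr's or AMD's actual implementation.
import configparser
from pathlib import Path

# A profile as an optimization service might deliver it: settings chosen
# for a specific game and GPU combination.
recommended = {
    "TextureDetail": "4",
    "ShadowQuality": "2",
    "AntiAliasing": "FXAA",
}

def apply_profile(config_path: Path, section: str, profile: dict) -> None:
    """Merge recommended settings into a game's INI-style config file."""
    config = configparser.ConfigParser()
    config.optionxform = str  # preserve the case of setting names
    config.read(config_path)  # silently skipped if the file doesn't exist yet
    if not config.has_section(section):
        config.add_section(section)
    for key, value in profile.items():
        config.set(section, key, value)
    with config_path.open("w") as f:
        config.write(f)

apply_profile(Path("BioshockInfinite.ini"), "Graphics", recommended)
```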

To get right to the point then: while we’re big fans of the concept, it’s clear that this is a case of AMD tripping over themselves in reacting to something NVIDIA has done, reaching for the fastest way to achieve the same thing. Like GeForce Experience, GEA is now bundled with AMD’s drivers and installed by default, but unlike GFE it’s still in beta at this point, and a very rough beta at that. And not to take an unnecessary shot at AMD, but even in beta GeForce Experience was never this raw or this incomplete.

So why are we so down on GEA? There are a few reasons, the most basic of which is that the Raptr service lacks enough performance data for GEA to offer meaningful recommendations. Even on a fairly old card like a Radeon HD 7950, GEA was only able to find settings for 5 of the 11 games we have installed on our GPU testbed, failing to include settings for a number of games that are months (if not years) old. To be fair, every service has to start somewhere, and GFE certainly didn’t launch with a massive library of games either, but 5 games, none newer than March, is a particularly bad showing.

Now a lot of this has to do with how Raptr collects the performance data it uses for recommendations. NVIDIA for their part decided to do everything in-house, relying on their driver validation GPU farms to benchmark games across multiple settings and find a good balance based on parameters picked by the GFE development team. Raptr, though backed by AMD, has nothing resembling NVIDIA’s GPU farms, and as such is going the crowdsourced route, relying on telemetry taken from Raptr users’ computers. Raptr’s data acquisition method is not necessarily wrong, but it means there’s no one to bootstrap the service with data, and consequently the service has started out with essentially nothing.
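To illustrate what a crowdsourced pipeline of this sort might look like, here is a rough sketch, again our own illustration rather than Raptr’s actual algorithm: aggregate frame rate telemetry by GPU, game, and preset, then recommend the highest-quality preset that still clears a target frame rate, with no recommendation possible until enough samples exist.

```python
# A sketch of crowdsourced settings recommendation: aggregate frame rate
# telemetry by (GPU, game, preset), then recommend the highest-quality
# preset whose average FPS still clears a target. This is our own
# illustration, not Raptr's actual algorithm.
from collections import defaultdict
from statistics import mean

# Each sample is (gpu, game, preset, fps) as reported by a client.
samples = [
    ("Radeon HD 7950", "Bioshock Infinite", "Ultra", 48.0),
    ("Radeon HD 7950", "Bioshock Infinite", "High", 62.5),
    ("Radeon HD 7950", "Bioshock Infinite", "High", 58.9),
    ("Radeon HD 7950", "Bioshock Infinite", "Medium", 81.3),
]

PRESET_ORDER = ["Ultra", "High", "Medium", "Low"]  # best quality first
TARGET_FPS = 60.0
MIN_SAMPLES = 2  # below this, there isn't enough data to recommend anything

def recommend(gpu, game):
    """Return the best preset backed by enough telemetry, or None."""
    buckets = defaultdict(list)
    for s_gpu, s_game, preset, fps in samples:
        if (s_gpu, s_game) == (gpu, game):
            buckets[preset].append(fps)
    for preset in PRESET_ORDER:  # walk from the best preset downward
        fps_list = buckets[preset]
        if len(fps_list) >= MIN_SAMPLES and mean(fps_list) >= TARGET_FPS:
            return preset
    return None  # the bootstrapping problem: no data, no recommendation

print(recommend("Radeon HD 7950", "Bioshock Infinite"))  # -> High
```

Until the buckets fill up, `recommend` returns nothing at all, which is exactly the situation GEA finds itself in today.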

Raptr for their part is aware of the problem they face, and in time the distribution of GEA alongside their own Raptr application will hopefully ensure that there are enough users playing enough games to collect the necessary data. Even so, they have had to implement what amounts to a solution to the tragedy of the commons to make sure that data gets collected: users cannot receive settings from the Raptr service unless they provide data in return. Turning off the telemetry service also turns off the client’s ability to pull down settings, full stop. Given the service’s requirements for data collection it’s likely the best solution to the problem, but regardless we have to point out that Raptr is alone in this requirement; NVIDIA can offer GFE without requiring performance telemetry from users.
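The gating policy itself is simple to picture in code. A minimal sketch, with a hypothetical client class of our own invention rather than Raptr’s actual API:

```python
# A sketch of the quid-pro-quo gating described above: disabling telemetry
# also disables settings downloads. The class and method names are
# hypothetical, not Raptr's actual API.
class OptimizationClient:
    def __init__(self, telemetry_enabled: bool):
        self.telemetry_enabled = telemetry_enabled

    def fetch_settings(self, game: str) -> dict:
        # The service only answers clients that contribute data in return.
        if not self.telemetry_enabled:
            raise PermissionError("Settings unavailable while telemetry is off")
        return {"TextureDetail": "4"}  # placeholder server response

client = OptimizationClient(telemetry_enabled=False)
try:
    client.fetch_settings("Bioshock Infinite")
except PermissionError as err:
    print(err)  # -> Settings unavailable while telemetry is off
```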

Moving on then, the other showstopper with GEA’s current optimization service is that the UI has clearly been an afterthought. The GEA UI lists settings by the raw values used in a game’s settings file rather than by the names the game gives those values. “Ultra” texture quality in Bioshock Infinite, for example, is labeled as texture detail “4”, and some settings are labeled even more opaquely. Without proper labeling it’s impossible to tell what those settings mean, let alone what they may do. As such, applying GEA settings right now is something of a shot in the dark, as you don’t know what you’re going to get.
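The fix is conceptually straightforward: ship a per-game table that maps raw config-file values back to the labels the game’s own settings menu uses. A quick sketch follows; the value-to-label mapping is our hypothetical illustration, not Bioshock Infinite’s verified internals.

```python
# Mapping raw config-file values to the labels a game's own settings menu
# uses. The numeric values below are hypothetical illustrations, not
# Bioshock Infinite's verified internals.
TEXTURE_DETAIL_LABELS = {
    "1": "Low",
    "2": "Medium",
    "3": "High",
    "4": "Ultra",
}

def describe_setting(name, raw_value):
    """Render a raw setting value as something a user can evaluate."""
    label = TEXTURE_DETAIL_LABELS.get(raw_value)
    if label is None:
        # This is effectively what GEA shows today: the bare number.
        return f"{name} = {raw_value}"
    return f"{name}: {label}"

print(describe_setting("Texture Detail", "4"))  # -> Texture Detail: Ultra
```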

Finally, presumably as a holdover from the fact that Raptr is free, GEA runs what can only be described as ads. These aren’t straight-up third-party advertisements; rather they direct users towards other services Raptr/GEA provides, such as free-to-play games and a rewards service. But the end result is the same, as these services are paid for by Raptr’s sponsors and are intended to drive users towards purchasing games and merchandise from those sponsors. Far be it from us to look down upon advertisements – after all, AnandTech is ad supported – but there’s something unseemly about bundling ad supported software with a driver download. We’re at something of a loss to explain why AMD doesn’t just foot the complete bill for their customized version of the Raptr client and have the ads removed entirely.

At any rate, we do have some faith that these issues can be dealt with in time and that GEA can essentially be fixed, but right now GEA is far too raw for distribution. It needs to go back into development for another few months (and the service needs to be bootstrapped with many more hardware configurations and games) before it will be of suitable quality for inclusion in AMD’s drivers. Otherwise AMD is doing their users a disservice by distributing inferior, ad supported software alongside the software required to use their products.

The Test

For the launch of the Radeon R9 290, both the press drivers and the launch drivers are AMD’s recently released Catalyst 13.11 Beta v8. Along with adding support for the 290 and the 47% fan speed override, the only other changes in these drivers involve Batman: Arkham Origins and Battlefield 4, games we aren’t using for this review, so the results are consistent with past drivers. Meanwhile, for NVIDIA’s cards we’re continuing to use their release 331.58 drivers.

CPU: Intel Core i7-4960X @ 4.2GHz
Motherboard: ASRock Fatal1ty X79 Professional
Power Supply: Corsair AX1200i
Hard Disk: Samsung SSD 840 EVO (750GB)
Memory: G.Skill RipjawZ DDR3-1866 4 x 8GB (9-10-9-26)
Case: NZXT Phantom 630 Windowed Edition
Monitor: Asus PQ321
Video Cards: AMD Radeon R9 290X
             AMD Radeon R9 290
             XFX Radeon R9 280X Double Dissipation
             AMD Radeon HD 7970 GHz Edition
             AMD Radeon HD 7970
             AMD Radeon HD 6970
             AMD Radeon HD 5870
             NVIDIA GeForce GTX Titan
             NVIDIA GeForce GTX 780
             NVIDIA GeForce GTX 770
Video Drivers: NVIDIA Release 331.58
               AMD Catalyst 13.11 Beta v1
               AMD Catalyst 13.11 Beta v5
               AMD Catalyst 13.11 Beta v8
OS: Windows 8.1 Pro

Comments

  • HisDivineOrder - Tuesday, November 5, 2013 - link

    Haha, spoken like someone who's never heard a card this loud. I can't wait to see all these cards on sale on ebay and forums everywhere. "I tried it and it's not for me, sidegrading to a 780," they'll say.

    This card is so loud you're going to be shocked by it. It's going to blow people's minds and it may even convert a few fanboys.
  • Finally - Tuesday, November 5, 2013 - link

    If he buys one with a nice custom fan, there won't be anything left to complain about. Truly terrible outlook for an Nvidiot, isn't it?
  • TheJian - Tuesday, November 5, 2013 - link

    You're forgetting they are using ref NV also. You don't get that when you buy an NV card and they come overclocked on top of quiet. Also this thing will draw the same watts no matter what. It remains to be seen how good a different cooler will actually be. Did AMD really choose two terrible fans for their product launch? Seriously? I'm wondering how much they can really fix this situation. AMD had to know this would cause bad reviews about noise nearly everywhere and even on AMD loving sites. I can't believe they are completely dumb, and chose a total piece of junk for the fan/heatsink here. I really think people are putting too much faith in a fix with a fan change. They are at 95 all day basically, how much fan do you need to fix that?

    If NV runs their gpus at 95 tomorrow (and cranked up even more to meet the noise they're getting here) these cards will both be spanked. You get a better cooler on NV cards that are NOT ref also.
  • jnad32 - Tuesday, November 5, 2013 - link

    The way I look at it, AMD is looking like an absolute genius. Everyone was ripping them on the 290X for it being too hot and too loud anyway. So instead of keeping the sound levels down they just went for what they do best, price/performance. They are now blowing every other card out of the water. There isn't a card on the planet that can touch this card in price/performance. Yea it's loud as hell, but at least you have to think about it now just because of the price. What I really want to see is them unleash the 290X sound threshold and see what kind of raw numbers it can put up. Let's be honest, the only people who should buy reference cards are the ones who are putting water blocks on them.

    People have been saying this about the temp since launch, and I still don't get it. If AMD designed the chip to run at those temps, what's the big deal as long as it's not damaging it?
  • swing848 - Tuesday, November 5, 2013 - link

    It will only get loud for me when playing games or the occasional benchmark. During games I wear headphones, and during benchmarks I can leave the room. I have a room dedicated to computer use and the house has good sound proofing, so, it will not bother other people.

    If I want it quiet I will use a water cooler with a large radiator and fan.

    It is better than dumping all the hot air from the video card into my case, even if it is well cooled with 200mm fans. I overclock my CPU and I do not want it, RAM, or chips on the motherboard to get any hotter than necessary.
  • zeock9 - Tuesday, November 5, 2013 - link

    The burning question on my mind at this point is why AMD is restricting board partners from releasing their own custom designed and obviously better performing coolers on this otherwise fantastic card?
  • techkitsune - Tuesday, November 5, 2013 - link

    They likely don't want to look bad.

    It's okay. It's tough doing thermal management. I cram 1,000w of LED into a 30mm x 30mm space. AMD doesn't have the cooling problems that I have. Nor does nVidia nor intel. They should be grateful. :D
  • HisDivineOrder - Tuesday, November 5, 2013 - link

    They don't have them yet. That's why they haven't made custom boards. They're just getting them right now. They're going with what they have, which right now are just the reference boards. In a month or so, they'll have QA'ed some solutions with pre-existing cooling options, assuming said cooling options are good enough to benefit these cards.

    The thing is, you have to know these cards are running REALLY, REALLY hot to hit these levels at 95 degrees, so... custom coolers may have a hard time handling these cards without some tweaks. Perhaps to get faster fans on there.

    Also, it takes time to redesign a board to add VRMs, and the 290 and 290X are still very, very new. You're not going to get an MSI Lightning version overnight.

    It's a solid deal in price, but man it's a shame AMD didn't offer a better custom cooler more attuned to the very special needs of the 290 series. It's also a shame their board is being pushed so hard and so much above what it seems capable of doing with reasonable power levels.

    This is like the Bulldozer of GPUs.
  • techkitsune - Tuesday, November 5, 2013 - link

    AMD could have just spent a few more dollars and used copper instead of aluminum, I would think. They could have easily doubled or tripled thermal conductivity and thus not needed to run the reference cooler anywhere near as high, plus that would leave a LOT of extra overclocking room.

    I still would buy it for the extra $45 that would have likely entailed, though I do worry about weight at that point. My 9800 GTX+ was pretty hefty, to say the least.
  • TheinsanegamerN - Tuesday, November 5, 2013 - link

    THIS. why does amd, or heck, any manufacturer, insist on using aluminum fins on a 250 watt+ gpu? my old amd 2600xt had a full copper heatsink, and it was nowhere near as power hungry as this card (and it ran cool to boot. never over 47c).
    use the exact same heatsink, but make those fins copper. wonder how much lower the temps would go?
