Gaming Benchmarks: High End

Alien: Isolation

If first-person survival mixed with horror is your sort of thing, then Alien: Isolation, based on the Alien franchise, should be an interesting title. Developed by The Creative Assembly and released in October 2014, Alien: Isolation has won numerous awards, from Game of the Year to several top 10/25 lists and Best Horror titles, racking up over a million sales by February 2015. Alien: Isolation uses a custom-built engine which includes dynamic sound effects and should be fully multi-core enabled.

For low end graphics, we test at 720p with Ultra settings, whereas for mid and high range graphics we bump this up to 1080p, taking the average frame rate as our marker with a scripted version of the built-in benchmark.

Alien Isolation on MSI R9 290X Gaming LE 4GB ($380)

Alien Isolation on ASUS GTX 980 Strix 4GB ($560)

When using the R9 290X, the AMD APUs (except for the 7850K) are within a few percentage points of similarly priced Intel processors for average frame rates; you need to spend $200 to get a +30% increase in frame rates. That being said, the frame rates in all our results (except the A8-6500T) were above 120 FPS on the GTX 980, albeit with the G3258 and i3-4130T ahead by ~10%.

Total War: Attila

The Total War franchise moves on to Attila, another title from The Creative Assembly, and a stand-alone strategy game set in AD 395 where the main storyline lets the gamer take control of the leader of the Huns in order to conquer parts of the world. Graphically the game can render hundreds or thousands of units on screen at once, all with their individual actions, and can put some of the big cards to task.

For low end graphics, we test at 720p with performance settings, recording the average frame rate. With mid and high range graphics, we test at 1080p with the quality setting. In both circumstances, unlimited video memory is enabled and the in-game scripted benchmark is used.

Total War: Attila on MSI R9 290X Gaming LE 4GB ($380)

Total War: Attila on ASUS GTX 980 Strix 4GB ($560)

With both of our high end cards, the true quad cores from Intel give performance gains but the better performer here is the G3258 when considering price.

Grand Theft Auto V

The highly anticipated iteration of the Grand Theft Auto franchise finally hit the shelves on April 14th 2015, with both AMD and NVIDIA in tow to help optimize the title. GTA doesn’t provide graphical presets, but opens up the options to users and extends the boundaries by pushing even the hardest systems to the limit using Rockstar’s Advanced Game Engine. Whether the user is flying high in the mountains with long draw distances or dealing with assorted trash in the city, when cranked up to maximum it creates stunning visuals but hard work for both the CPU and the GPU.

For our test we have scripted a version of the in-game benchmark, relying only on the final part which combines a flight scene along with an in-city drive-by followed by a tanker explosion. For low end systems we test at 720p on the lowest settings, whereas mid and high end graphics play at 1080p with very high settings across the board. We record both the average frame rate and the percentage of frames under 60 FPS (16.6ms).
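Both metrics described above can be derived from a log of per-frame render times. As a minimal sketch (assuming a simple list of frame times in milliseconds, rather than the actual output format of the in-game benchmark), the average frame rate and the percentage of frames missing the 16.6ms budget could be computed like this:

```python
def summarize(frame_times_ms):
    """Summarize a benchmark run from per-frame render times (in ms)."""
    total_seconds = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_seconds
    # A frame taking longer than 1000/60 ms (~16.6 ms) misses the 60 FPS target.
    budget_ms = 1000.0 / 60.0
    slow_frames = sum(1 for t in frame_times_ms if t > budget_ms)
    pct_under_60 = 100.0 * slow_frames / len(frame_times_ms)
    return avg_fps, pct_under_60
```

Note that averaging frame times and then inverting, as done here, weights each frame equally; averaging instantaneous FPS values instead would overstate the contribution of fast frames.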

Grand Theft Auto V on MSI R9 290X Gaming LE 4GB ($380)
Grand Theft Auto V on MSI R9 290X Gaming LE 4GB ($380) [Under 60 FPS]

Grand Theft Auto V on ASUS GTX 980 Strix 4GB ($560)
Grand Theft Auto V on ASUS GTX 980 Strix 4GB ($560) [Under 60 FPS]

The A8-7650K remains competitive in GTA around the $100 price point, but the i3-4130T for an extra $17 gets a slightly better score.

GRID: Autosport

No graphics tests are complete without some input from Codemasters and the EGO engine, which means for this round of testing we point towards GRID: Autosport, the next iteration in the GRID racing franchise. As with our previous racing tests, each update to the engine aims to add effects, reflections, detail and realism, with Codemasters making 'authenticity' a main focal point for this version.

GRID’s benchmark mode is very flexible, and as a result we created a test race using a shortened version of the Red Bull Ring with twelve cars doing two laps. The car in focus starts last and is quite fast, so it usually finishes second or third. For low end graphics we test at 1080p with medium settings, whereas mid and high end graphics get the full 1080p maximum. Both the average and minimum frame rates are recorded.

GRID: Autosport on MSI R9 290X Gaming LE 4GB ($380)
GRID: Autosport on MSI R9 290X Gaming LE 4GB ($380) [Minimum FPS]

GRID: Autosport on ASUS GTX 980 Strix 4GB ($560)
GRID: Autosport on ASUS GTX 980 Strix 4GB ($560) [Minimum FPS]

GRID, as we saw in our other testing, still seems optimised for Intel CPUs. The minimum frame rates are the key metric here, especially on the R9 290X.

Middle-Earth: Shadows of Mordor

The final title in our testing is another battle of system performance with the open world action-adventure title, Shadows of Mordor. Produced by Monolith using the LithTech Jupiter EX engine and numerous detail add-ons, SoM goes for detail and complexity to a large extent, despite having been cut down from the original plans. The main story was written by the same writer as Red Dead Redemption, and SoM received Zero Punctuation's Game of the Year in 2014.

For testing purposes, SoM offers a dynamic screen resolution setting, allowing us to render at high resolutions that are then scaled down to the monitor. As a result, we get several tests using the in-game benchmark. For low end graphics we test at 720p with low settings, whereas mid and high end graphics get 1080p Ultra. The top graphics test is also run at 3840x2160 with Ultra settings, and we test two cards at 4K where possible.

Shadows of Mordor on MSI R9 290X Gaming LE 4GB ($380)
Shadows of Mordor on MSI R9 290X Gaming LE 4GB ($380) [Minimum FPS]

Shadows of Mordor on ASUS GTX 980 Strix 4GB ($560)
Shadows of Mordor on ASUS GTX 980 Strix 4GB ($560) [Minimum FPS]

SoM at 1080p shows small differences between AMD and Intel here, though you might be hard pressed to notice them.

Middle-Earth: Shadows of Mordor at 4K

Shadows of Mordor on MSI R9 290X Gaming LE 4GB ($380)
Shadows of Mordor on MSI R9 290X Gaming LE 4GB ($380) [Minimum FPS]

Shadows of Mordor on ASUS GTX 980 Strix 4GB ($560)
Shadows of Mordor on ASUS GTX 980 Strix 4GB ($560) [Minimum FPS]

Moving to 4K evens out the average frame rates on all our CPUs, although a slight spread is appearing in the minimum frame rates.

Middle-Earth: Shadows of Mordor CrossFire at 4K

Shadows of Mordor on 2x MSI R9 290X Gaming LE 4GB ($380)
Shadows of Mordor on 2x MSI R9 290X Gaming LE 4GB ($380) [Minimum FPS]

Bumping up to dual R9 290Xs shows that the A8-7650K is highly competitive in both average and minimum frame rates.


177 Comments


  • TrackSmart - Wednesday, May 13, 2015 - link

    This comment is for Ian Cutress,

    First, thank you for the review, which was rich with performance figures and information. That said, something seems missing in the Conclusion. To be precise, the article doesn't really have a clear conclusion or recommendation, which is what many of us come here for.

    It's nice to hear about your cousin-in-law's good experiences, but the conclusion doesn't clearly answer the key question I think many readers might have: Where does this product fit in the world of options to consider when buying a new processor? Is it a good value in its price range? Should it be ignored unless you plan to use the integrated graphics for gaming? Or does it offer enough bang-for-the-buck to be a viable alternative to Intel's options for general non-gaming usage, especially if motherboard costs are considered? Should we consider AMD again, if we are in a particular niche of price and desired features?

    Basically, after all of your time with this chip and with your broader knowledge of the market offerings, what is your expert interpretation of the merits or demerits of considering this processor or its closely related AMD peers?
  • Nfarce - Thursday, May 14, 2015 - link

    " Ultimately AMD likes to promote that for a similarly priced Intel+NVIDIA solution, a user can enable dual graphics with an APU+R7 discrete card for better performance."

    I have *long* wondered why Intel and Nvidia don't get together and figure out a way to pair up the on-board graphics power of their CPUs with a discrete Nvidia GPU. It just seems to me such a waste for those of us who build our rigs for discrete video cards and just disable the on-board graphics of the CPU. Game developers could code their games based on this as well for better performance. Right now game developer Slightly Mad Studios claims their Project Cars racing simulation draws PhysX from the CPU and not a dedicated GPU. However, I have yet to find that definitively true based on benchmarks...I see no difference in performance between moving PhysX resources to my GPUs (970 SLI) or CPU (4690K 4.7GHz) in the Nvidia control panel in that game.
  • V900 - Thursday, May 14, 2015 - link

    Something similar to what you're describing is coming in DX12...

    But the main reason they haven't is because unless you're one of the few people who got an AMD APU because your total CPU+GPU budget is around $100, it doesn't make any sense.

    First of all, the performance you get from an Intel iGPU in a desktop system will be minimal compared to even a $200-300 Nvidia card. And secondly, if you crank up the iGPU on an Intel CPU, it may take away some of the CPU's performance/overhead.

    If we're talking about a laptop, taking watts away from the CPU and negatively impacting overall battery life will be even bigger drawbacks.
  • Nfarce - Thursday, May 14, 2015 - link

    "But the main reason they haven't is because unless you're one of the few people who got an AMD APU because your total CPU+GPU budget is around $100, it doesn't make any sense."

    Did you even read the hardware I have? Further, reading benchmarks from the built in 4600 graphics of i3/i5/i7 CPUs shows me that it is a wasted resource. And regarding impact on CPU performance, considering that higher resolutions (1440p and 4K) and higher quality/AA settings are more dependent on GPU performance than CPU performance, the theory that utilizing onboard CPU graphics with a dedicated GPU would decrease overall performance is debatable. I see little gains in my highly overclocked 4690K running at 4.7GHz and running at the stock 3.9GHz turbo frequency in most games.

    All we have to go on currently is 1) Intel HD 4600 performance alone in games, and 2) CPU performance demands at higher resolutions on games with dedicated cards.
  • UtilityMax - Friday, May 15, 2015 - link

    I am guessing that they didn't get together because dual-graphics is very difficult to make work right. AMD is putting effectively the same type of GPU cores on its discrete GPUs and integrated APUs, and it still took them a while to make it work at all.
  • V900 - Thursday, May 14, 2015 - link

    I guess one thing we all learned today, besides the fact that AMD's APUs still kinda blow, is that there is a handful of people who are devoted enough to their favorite processor manufacturer to seriously believe that:

    A: Intel is some kind of evil and corrupt empire ala Star Wars.

    B: They're powerful enough to bribe/otherwise silence "da twooth" among all of Anandtech and most of the industry.

    C: 95% of the tech press is corrupt enough to gladly do their bidding.

    D: Mantle was an API hardcoded by Jesus Christ himself in assembly language. It's so powerful that if it got widespread, no one would need to buy a new CPU or GPU for the rest of this decade. Which is why "they" forced
  • V900 - Thursday, May 14, 2015 - link

    Which is why "they" forced AMD to cancel Mantle. Then Microsoft totally 110% copied it and renamed it "DX12".

    Obviously all of the above is 100% logical, makes total sense and is much more likely than AMD releasing shoddy CPUs the last decade, and the press acknowledging that.
  • wingless - Thursday, May 14, 2015 - link

    AMD DOMINATION!!!!! If only the charts looked like that with discrete graphics as well....
  • Vayra - Friday, May 15, 2015 - link

    I still really can't see a scenario where the APU would be the best choice. Well, there may be one: for those on a very tight budget who wish to play games on PC regardless. But this would mean that AMD has designed and reiterated a product that only finds its market in the least interesting group of consumers: those who want everything for nothing... Not really where you want to be.
  • UtilityMax - Friday, May 15, 2015 - link

    Well, right now, arguably, if one has $500 or less for a gaming PC build, it would be better to buy a PlayStation 4. High end builds are where the money is in the enthusiast gaming market.
