Gaming Benchmarks: Low End

Alien: Isolation

If first-person survival mixed with horror is your sort of thing, then Alien: Isolation, based on the Alien franchise, should be an interesting title. Developed by The Creative Assembly and released in October 2014, Alien: Isolation has won numerous awards, from Game of the Year to several top-10/25 lists and Best Horror titles, racking up over a million sales by February 2015. Alien: Isolation uses a custom-built engine which includes dynamic sound effects and should be fully multi-core enabled.

For low-end graphics, we test at 720p with Ultra settings, whereas for mid- and high-range graphics we bump this up to 1080p, taking the average frame rate as our marker with a scripted version of the built-in benchmark.

Alien Isolation on Integrated Graphics

Alien Isolation on ASUS R7 240 DDR3 2GB ($70)

Alien Isolation on ASUS R7 240 DDR3 2GB ($70) with Dual Graphics

When it comes to integrated graphics, the APUs rule the roost. The A8-7650K sits in its stack where it should, between the A10-7800 and the A8-7600. When we use a low-end GPU, all of our CPUs perform similarly, showing that this benchmark is more GPU-limited at this level. In Dual Graphics mode, the frame rate moves up to just under double the integrated value.

Total War: Attila

The Total War franchise moves on to Attila, another development from The Creative Assembly. It is a stand-alone strategy title set in 395 AD, where the main story line puts the gamer in control of the leader of the Huns in a bid to conquer parts of the world. Graphically, the game can render hundreds or even thousands of units on screen at once, each with its own individual actions, and can put some of the big cards to task.

For low-end graphics, we test at 720p with performance settings, recording the average frame rate. With mid- and high-range graphics, we test at 1080p with the quality setting. In both circumstances, unlimited video memory is enabled and the in-game scripted benchmark is used.

Total War: Attila on Integrated Graphics

Total War: Attila on ASUS R7 240 DDR3 2GB ($70)

Total War: Attila on ASUS R7 240 DDR3 2GB ($70) with Dual Graphics

As with Alien, in Attila the AMD APUs at $100 beat an Intel IGP at $340. When we move to the R7 240, however, the Intel CPUs actually have a slight advantage, perhaps showing that Attila leans on CPU performance here. Again, Dual Graphics mode offers nearly double the frame rate, almost hitting 60 FPS.

Grand Theft Auto V

The highly anticipated iteration of the Grand Theft Auto franchise finally hit the shelves on April 14th, 2015, with both AMD and NVIDIA in tow to help optimize the title. GTA doesn't provide graphical presets, but instead opens up the options to users and pushes even the hardiest systems to the limit using Rockstar's Advanced Game Engine. Whether the user is flying high over the mountains with long draw distances or dealing with assorted trash in the city, when cranked up to maximum the engine creates stunning visuals but hard work for both the CPU and the GPU.

For our test we have scripted a version of the in-game benchmark, relying only on the final part which combines a flight scene along with an in-city drive-by followed by a tanker explosion. For low end systems we test at 720p on the lowest settings, whereas mid and high end graphics play at 1080p with very high settings across the board. We record both the average frame rate and the percentage of frames under 60 FPS (16.6ms).
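Deriving these two numbers from a log of per-frame render times is simple arithmetic; below is a minimal sketch of the calculation (the sample frame-time values and the `summarize` helper are hypothetical illustrations, not our actual logging tool):

```python
# Sketch: compute average FPS and the percentage of frames that miss the
# 60 FPS budget (frame time above 1000/60 ~= 16.67 ms) from a list of
# per-frame render times in milliseconds. Sample values are hypothetical.

def summarize(frame_times_ms):
    """Return (average FPS, percent of frames over the 16.67 ms budget)."""
    total_s = sum(frame_times_ms) / 1000.0      # total run time in seconds
    avg_fps = len(frame_times_ms) / total_s     # frames per second
    budget_ms = 1000.0 / 60.0                   # 60 FPS frame budget
    slow = sum(1 for t in frame_times_ms if t > budget_ms)
    pct_under_60 = 100.0 * slow / len(frame_times_ms)
    return avg_fps, pct_under_60

avg, pct = summarize([14.2, 15.8, 16.1, 18.9, 22.4, 15.0])
# avg ~= 58.6 FPS; pct ~= 33.3% of frames under 60 FPS
```

Note that a high average frame rate can coexist with a large under-60 percentage, which is why we report both.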

Grand Theft Auto V on Integrated Graphics

Grand Theft Auto V on Integrated Graphics [Under 60 FPS]

Grand Theft Auto V on ASUS R7 240 DDR3 2GB ($70)

Grand Theft Auto V on ASUS R7 240 DDR3 2GB ($70) [Under 60 FPS]

Grand Theft Auto V on ASUS R7 240 DDR3 2GB ($70) with Dual Graphics

Grand Theft Auto V on ASUS R7 240 DDR3 2GB ($70) with Dual Graphics [Under 60 FPS]

One of the surprises in this review, for me, was the GTA performance. Here we have a $105 APU that easily breaks through the 30 FPS barrier at our low-end settings, almost hitting a 50 FPS average. The graph on the right shows the percentage of frames under 60 FPS (or over 16.6 ms), and it is clear that at these settings more horsepower is needed. Using the R7 240 tells a slightly different story, although the $105 APU still sits between the $72 and $122 Intel CPUs.

GRID: Autosport

No round of graphics testing is complete without some input from Codemasters and the EGO engine, which means this time we point towards GRID: Autosport, the latest iteration in the GRID racing series. As with our previous racing tests, each update to the engine aims to add effects, reflections, detail and realism, with Codemasters making 'authenticity' a main focal point for this version.

GRID’s benchmark mode is very flexible, and as a result we created a test race using a shortened version of the Red Bull Ring with twelve cars doing two laps. The car in focus starts last and is quite fast, but usually finishes second or third. For low-end graphics we test at 1080p medium settings, whereas mid- and high-end graphics get the full 1080p maximum. Both the average and minimum frame rates are recorded.

GRID: Autosport on Integrated Graphics

GRID: Autosport on Integrated Graphics [Minimum FPS]

GRID: Autosport on ASUS R7 240 DDR3 2GB ($70)

GRID: Autosport on ASUS R7 240 DDR3 2GB ($70) [Minimum FPS]

GRID: Autosport on ASUS R7 240 DDR3 2GB ($70) with Dual Graphics

GRID: Autosport on ASUS R7 240 DDR3 2GB ($70) with Dual Graphics [Minimum FPS]

Codemasters' racing engines have historically liked as much hardware as can be thrown at them, and on integrated graphics it is a clear win for AMD's parts, which achieve almost double the frame rate. This was perhaps expected, knowing how AMD devotes more of its die area to graphics. When we add the R7 240, the difference becomes negligible, and only a small rise is seen from Dual Graphics.

Middle-earth: Shadow of Mordor

The final title in our testing is another battle of system performance with the open-world action-adventure title Shadow of Mordor. Produced by Monolith using the LithTech Jupiter EX engine and numerous detail add-ons, SoM goes for detail and complexity to a large extent, despite having been cut down from the original plans. The main story was penned by the same writer as Red Dead Redemption, and the game received Zero Punctuation's Game of the Year for 2014.

For testing purposes, SoM provides a dynamic screen resolution setting, allowing us to render at high resolutions that are then scaled down to the monitor. As a result, we get several tests out of the in-game benchmark. For low-end graphics we test at 720p with low settings, whereas mid- and high-end graphics get 1080p Ultra. The top graphics test is also run at 3840x2160 with Ultra settings, and we test two cards at 4K where possible.
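The dynamic resolution setting is effectively a supersampling multiplier: the engine renders at the output resolution times a scale factor, then downsamples to the display. A short sketch of the arithmetic (the `render_target` function name is ours, not the game's):

```python
# Supersampling arithmetic: render at (output x scale), then downsample
# to the monitor. A 2.0x linear scale on a 1080p display gives the
# 3840x2160 render target used for our 4K-class test.

def render_target(out_w, out_h, scale):
    """Return the internal render resolution for a given linear scale."""
    return int(out_w * scale), int(out_h * scale)

assert render_target(1920, 1080, 2.0) == (3840, 2160)
```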

Shadow of Mordor on Integrated Graphics

Shadow of Mordor on Integrated Graphics [Minimum FPS]

Shadow of Mordor on ASUS R7 240 DDR3 2GB ($70)

Shadow of Mordor on ASUS R7 240 DDR3 2GB ($70) [Minimum FPS]

Shadow of Mordor on ASUS R7 240 DDR3 2GB ($70) with Dual Graphics

Shadow of Mordor on ASUS R7 240 DDR3 2GB ($70) with Dual Graphics [Minimum FPS]

As with GRID, Mordor loves integrated graphics, with a clear margin for AMD over Intel in both average and minimum frame rates. With the R7 240, all CPUs are more-or-less equal, although Intel has the upper hand in minimum frame rates. Dual Graphics mode gives a good boost to the average frame rate, moving from 44 FPS to 64 FPS on the A8-7650K.

177 Comments

  • TrackSmart - Wednesday, May 13, 2015 - link

    This comment is for Ian Cutress,

    First, thank you for the review, which was rich with performance figures and information. That said, something seems missing in the Conclusion. To be precise, the article doesn't really have a clear conclusion or recommendation, which is what many of us come here for.

    It's nice to hear about your cousin-in-law's good experiences, but the conclusion doesn't clearly answer the key question I think many readers might have: Where does this product fit in the world of options to consider when buying a new processor? Is it a good value in its price range? Should it be ignored unless you plan to use the integrated graphics for gaming? Or does it offer enough bang-for-the-buck to be a viable alternative to Intel's options for general non-gaming usage, especially if motherboard costs are considered? Should we consider AMD again, if we are in a particular niche of price and desired features?

    Basically, after all of your time with this chip and with your broader knowledge of the market offerings, what is your expert interpretation of the merits or demerits of considering this processor or its closely related AMD peers?
  • Nfarce - Thursday, May 14, 2015 - link

    " Ultimately AMD likes to promote that for a similarly priced Intel+NVIDIA solution, a user can enable dual graphics with an APU+R7 discrete card for better performance."

    I have *long* wondered why Intel and Nvidia don't get together and figure out a way to pair up the on-board graphics power of their CPUs with a discrete Nvidia GPU. It just seems to me such a waste for those of us who build our rigs for discrete video cards and just disable the on-board graphics of the CPU. Game developers could code their games based on this as well for better performance. Right now game developer Slightly Mad Studios claims their Project Cars racing simulation draws PhysX from the CPU and not a dedicated GPU. However, I have yet to find that definitively true based on benchmarks...I see no difference in performance between moving PhysX resources to my GPUs (970 SLI) or CPU (4690K 4.7GHz) in the Nvidia control panel in that game.
  • V900 - Thursday, May 14, 2015 - link

    Something similar to what you're describing is coming in DX12...

    But the main reason they haven't is because unless you're one of the few people who got an AMD APU because your total CPU+GPU budget is around $100, it doesn't make any sense.

    First of all, the performance you get from an Intel iGPU in a desktop system will be minimal compared to even a $200-300 Nvidia card. And secondly, if you crank up the iGPU on an Intel CPU, it may take away some of the CPU's performance/overhead.

    If we're talking about a laptop, taking watts away from the CPU, and overall negatively impacting battery life will be even bigger drawbacks.
  • Nfarce - Thursday, May 14, 2015 - link

    "But the main reason they haven't is because unless you're one of the few people who got an AMD APU because your total CPU+GPU budget is around $100, it doesn't make any sense."

    Did you even read the hardware I have? Further, reading benchmarks from the built in 4600 graphics of i3/i5/i7 CPUs shows me that it is a wasted resource. And regarding impact on CPU performance, considering that higher resolutions (1440p and 4K) and higher quality/AA settings are more dependent on GPU performance than CPU performance, the theory that utilizing onboard CPU graphics with a dedicated GPU would decrease overall performance is debatable. I see little gains in my highly overclocked 4690K running at 4.7GHz and running at the stock 3.9GHz turbo frequency in most games.

    All we have to go on currently is 1) Intel HD 4600 performance alone in games, and 2) CPU performance demands at higher resolutions on games with dedicated cards.
  • UtilityMax - Friday, May 15, 2015 - link

    I am guessing that they didn't get together because dual graphics is very difficult to make work right. AMD is putting effectively the same type of GPU cores on its discrete GPUs and integrated APUs, and it still took them a while to make it work at all.
  • V900 - Thursday, May 14, 2015 - link

    I guess one thing we all learned today, besides the fact that AMD's APUs still kinda blow, is that there is a handful of people who are devoted enough to their favorite processor manufacturer to seriously believe that:

    A: Intel is some kind of evil and corrupt empire ala Star Wars.

    B: They're powerful enough to bribe/otherwise silence "da twooth" among all of Anandtech and most of the industry.

    C: 95% of the tech press is corrupt enough to gladly do their bidding.

    D: Mantle was an API hardcoded by Jesus Christ himself in assembler language. It's so powerful that if it got widespread, no one would need to buy a new CPU or GPU the rest of this decade. Which is why "they" forced
  • V900 - Thursday, May 14, 2015 - link

    Which is why "they" forced AMD to cancel Mantle. Then Microsoft totally 110% copied it and renamed it "DX12".

    Obviously all of the above is 100% logical, makes total sense and is much more likely than AMD releasing shoddy CPUs the last decade, and the press acknowledging that.
  • wingless - Thursday, May 14, 2015 - link

    AMD DOMINATION!!!!! If only the charts looked like that with discrete graphics as well....
  • Vayra - Friday, May 15, 2015 - link

    Still can't really see a scenario where the APU would be the best choice. Well, there may be one: those on a very tight budget who wish to play games on PC regardless. But this would mean that AMD has designed and reiterated a product that only finds its market in the least interesting group of consumers: those that want everything for nothing... Not really where you want to be.
  • UtilityMax - Friday, May 15, 2015 - link

    Well, right now, arguably, if one has $500 or less for a gaming PC build, it would be better to buy a PlayStation 4. High-end builds are where the money is in the enthusiast gaming market.
