Crysis: Warhead

Our first graphics test is Crysis: Warhead, which in spite of its relatively high system requirements is the oldest game in our test suite. Crysis was the first game to really make use of DX10, and set a very high bar for modern games that still hasn't been completely cleared. And while its age means it's not heavily played these days, it's a great reference for how far GPU performance has come since 2008. For an iGPU to even run Crysis at a playable framerate is a significant accomplishment, and even more so if it can do so at better than performance (low) quality settings.

Crysis: Warhead - Frost Bench


Crysis sets the tone for a lot of what we'll see in this performance review. The Radeon HD 7660D on AMD's A10-5800K boosts performance by around 15 - 26% over the top end Llano part. The smaller Radeon HD 7560D GPU manages a small increase over the top-end Llano at worst, and at best pulls ahead by 18%.

Compared to Ivy Bridge, well, there's no comparison. Trinity is significantly faster than Intel's HD 4000, and compared to HD 2500 the advantage is tremendous.


Metro 2033

Our next graphics test is Metro 2033, another graphically challenging game. Like Crysis, this is a game that is traditionally unplayable on many integrated GPUs, even in DX9 mode.

Metro 2033


Metro 2033 shows us a 6 - 13% performance advantage for the top end Trinity part compared to Llano. The advantage over Intel's HD 4000 ranges from 20 - 40% depending on the resolution/quality settings. In general AMD is able to either deliver the same performance at much better quality or better performance at the same quality as Ivy Bridge.

The more important comparison is looking at the A8-5600K vs. Intel's HD 4000 and 2500. AMD is still able to hold onto a significant advantage there, even with its core-reduced GPU.

Comments

  • Arbie - Thursday, September 27, 2012 - link


    For all the reasons you listed, Crysis Warhead is very much worth keeping in the mix. Personally, it's one of the few games I return to and easily the best of all of them. I'm very interested in how the new chips run it.

    Thanks.
  • SanX - Thursday, September 27, 2012 - link

    Make these processors capable of 2-, 4-, 6-, and 8-chip configurations, and make appropriately cheap motherboards, so the processors sell by the shovelful.

    They will be happy, we will be happy. Intel will be in trouble.
    Indeed, a 32-core PC for less than $1000!
  • calzahe - Thursday, September 27, 2012 - link

    The main issue with Trinity is that it's basically the same as Llano, with only cosmetic architectural improvements: VLIW5 -> VLIW4 in the GPU and the new x86 Piledriver cores... Meanwhile the number of streaming processors dropped from 400 to 384, and the memory controller still has only 2 channels.

    The problem for AMD is that they don't understand that people who could buy an APU to play games don't want to stick with low graphics settings, and would rather pay extra for a discrete graphics card and set everything to High. And the people who don't play games buy Intel Ivy Bridge because it consumes less energy and runs quieter.

    To make the next-gen Kaveri APU attractive, AMD should give it a minimum of 800 streaming processors and a 4-channel memory controller with DDR4 support. Otherwise Intel's Haswell will destroy AMD completely next year...

    As for the laptop market, Ivy Bridge has similar performance to Trinity but provides much longer battery life. So the solution for AMD, again, is to make an APU with 800 or more streaming processors and a 4-channel memory controller - it won't give 10 hours of battery life, but combined with effective switching-off of idle cores it will still be more efficient at power saving than a CPU plus a discrete graphics card. Then many people would buy these laptops for gaming and HD movies.

    Regarding the tablet/smartphone market, AMD should accept the fact that the GloFo/TSMC 32nm/28nm manufacturing processes are inferior to Intel's 22nm. So unless GloFo is on par with Intel at 14nm in 2014 (which is highly unlikely), AMD has no chance against Intel. That's why, instead of wasting a lot of money and resources on Brazos, they should license the ARM architecture and combine it with Radeon cores, which could be quite competitive with or even better than Tegra or Snapdragon.

    If AMD doesn't make improvements quickly, then in 1-2 years they will be sold off or bankrupt.
  • silverblue - Thursday, September 27, 2012 - link

    Do you realise that once AMD implements its HSA initiative (along with perhaps on-die memory), it won't actually need a 4-channel memory bus? Faster clocked RAM is a must, though.

    In any case, people who buy APUs aren't in fact after bleeding edge performance but something affordable that doesn't perform like a dog. Add an external GPU if you like but that's really Vishera's area (and the dual module CPUs have no GPUs and as such will overclock better - Trinity's CPU cores could be more of a hindrance here).
  • calzahe - Thursday, September 27, 2012 - link

    HSA will not help much with 2-channel memory, and on-die memory or 3D memory stacking will happen at 14nm at best due to transistor budget restrictions. But AMD would be able to use faster-clocked DDR4 with a 4-channel memory controller even next year without much effort.

    Those who buy discrete graphics cards usually pair them with Intel CPUs, and Vishera will not change this; lots of people also prefer Nvidia cards over AMD. So to offer some competition, AMD should combine mid-range or even high-end GPUs with 4-8 x86 cores in an APU, use faster-clocked DDR4 on a 4-channel memory controller, and sell the APUs for 200-400usd. That would be more energy-efficient and cheaper than paying 200-300usd for an Intel CPU plus 250-500usd for a good graphics card. Most importantly, AMD has all the technologies and resources to make this happen even next year - only a correct management decision is required...
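To put rough numbers on the bandwidth side of this argument: theoretical peak memory bandwidth scales linearly with both channel count and transfer rate, so a quad-channel DDR4 controller of the kind proposed above would more than double Trinity's dual-channel DDR3 peak. A minimal sketch (the DDR3-1866 and DDR4-2400 speed grades are illustrative assumptions, not figures from the article):

```python
# Theoretical peak bandwidth = channels * transfer rate (MT/s) * 8 bytes
# per 64-bit channel. Real-world sustained bandwidth is lower.

def peak_bandwidth_gbps(channels: int, mega_transfers: int) -> float:
    """Theoretical peak memory bandwidth in GB/s (1 GB = 1000 MB)."""
    return channels * mega_transfers * 8 / 1000

trinity_ddr3 = peak_bandwidth_gbps(2, 1866)   # dual-channel DDR3-1866
proposed_ddr4 = peak_bandwidth_gbps(4, 2400)  # hypothetical quad-channel DDR4-2400

print(f"dual-channel DDR3-1866: {trinity_ddr3:.1f} GB/s")   # ~29.9 GB/s
print(f"quad-channel DDR4-2400: {proposed_ddr4:.1f} GB/s")  # 76.8 GB/s
```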
  • wenbo - Thursday, October 4, 2012 - link

    You make it sound so easy. If it is that easy, people would have done that already.
  • wwwcd - Thursday, September 27, 2012 - link

    I agree that all of AMD's desktop platforms need a 4-channel memory controller, but I think this option must be released immediately... The fact is that DDR4 for desktops won't arrive before 2015. Four channels of high-frequency DDR3 are enough for the present.
  • silverblue - Friday, September 28, 2012 - link

    It's also not cheap to implement. One of the reasons the top-end Intel boards are so expensive, I expect. I think it'd be better to go for higher speed first and foremost.

    The extra bandwidth could let the CPU breathe a little better as well as open up GPU performance at higher detail levels; however, I'm not sure it'll be the massive boost people are hoping for. Keeping a 384-shader GPU means you'll get potentially HD 4830/4770 performance, with the added bonus of more RAM than either of those two cards; however, Trinity isn't THAT bandwidth constrained - adding more shaders would certainly alter that picture.
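The point that adding more shaders would alter the picture can be made concrete: with the memory pool fixed, per-shader bandwidth falls in proportion to shader count. A quick sketch, assuming dual-channel DDR3-1866 (an illustrative speed grade) and the hypothetical 800-shader part discussed above:

```python
# Bandwidth available per shader: the same dual-channel DDR3 pool stretched
# over more shaders leaves each one proportionally less to work with.

DUAL_DDR3_1866_GBPS = 2 * 1866 * 8 / 1000  # ~29.9 GB/s theoretical peak

def gbps_per_shader(total_gbps: float, shaders: int) -> float:
    """Average theoretical bandwidth per shader, in GB/s."""
    return total_gbps / shaders

trinity = gbps_per_shader(DUAL_DDR3_1866_GBPS, 384)  # Trinity's shader count
bigger = gbps_per_shader(DUAL_DDR3_1866_GBPS, 800)   # hypothetical larger iGPU

print(f"384 shaders: {trinity * 1000:.0f} MB/s each")  # ~78 MB/s
print(f"800 shaders: {bigger * 1000:.0f} MB/s each")   # ~37 MB/s
```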
  • kyuu - Friday, September 28, 2012 - link

    "The problem for AMD is that they don't understand that people who could buy an APU to play games don't want to stick with low graphics settings, and would rather pay extra for a discrete graphics card and set everything to High."

    This sentence makes no sense. If someone is looking at buying an APU, then they aren't looking at a discrete GPU setup and obviously aren't looking to run games at max settings. And, contrary to what a lot of people seem to think, a lot of people don't care about running the latest-and-greatest at max settings.

    Obviously, for an enthusiast gamer, Trinity doesn't make a whole lot of sense on the desktop (unless possibly they get asymmetrical crossfire working really well). But in the mobile arena, Trinity makes a lot of sense, giving respectable gaming prowess for significantly cheaper than an Intel CPU and discrete GPU combination as well as superior gaming battery life.

    What I'm most looking forward to is a tablet of Surface quality with a low-voltage Trinity powering it.

    No doubt more memory bandwidth would be greatly beneficial to AMD's APUs, but it's not as simple as just going to 4-channel memory. That increases the cost of the motherboard and requires paying for four sticks of memory, and it may not be practical in the mobile arena (which is where Trinity shines most, aside from HTPC duty).
