Metro: Last Light

Metro: Last Light is the latest entry in the Metro series of post-apocalyptic shooters by developer 4A Games. Like its predecessor, Last Light sets a high bar for visual quality, and at its highest settings an equally high bar for system requirements thanks to its advanced lighting system. That doesn't preclude it from running on iGPUs, since the game scales down rather well, but it does mean we have to run at fairly low resolutions to get a playable framerate.

[Chart: Metro: Last Light]

Looking at desktop parts alone, Intel really suffers from not having a socketed GT3 SKU. Although HD 4600 is appreciably faster than HD 4000 (+30%), both Trinity and Richland are around 17% faster still. As you'll see, Metro ends up being one of the smaller gaps between Richland and Haswell in our suite.

[Chart: Metro: Last Light]

As memory bandwidth becomes the ultimate bounding condition, the gap between Richland and Haswell shrinks considerably. Note that on the HD 4600 side, the difference between DDR3-1333 and DDR3-2400 is only 10% here. Given the limited performance of the 20 EU Haswell GPU configuration, it doesn't seem like Intel is all that bandwidth limited here.
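To put the memory scaling discussion in rough perspective, here is a quick back-of-the-envelope sketch in Python of theoretical peak bandwidth for a dual-channel DDR3 configuration. The function name is ours and real-world effective bandwidth is lower (and shared with the CPU cores); only the relative scaling matters here.

    # Theoretical peak bandwidth for dual-channel DDR3: transfers/s x 8 bytes
    # per 64-bit channel x 2 channels. Effective bandwidth is lower in practice,
    # and the iGPU shares it with the CPU cores.
    def ddr3_dual_channel_gbps(transfer_rate_mt_s):
        bytes_per_transfer = 8  # 64-bit channel
        channels = 2
        return transfer_rate_mt_s * 1e6 * bytes_per_transfer * channels / 1e9

    for speed in (1333, 1600, 1866, 2133, 2400):
        print("DDR3-%d: %.1f GB/s" % (speed, ddr3_dual_channel_gbps(speed)))

Going from DDR3-1333 to DDR3-2400 raises theoretical bandwidth from roughly 21GB/s to 38GB/s, so seeing only a 10% gain on HD 4600 is another sign that the 20 EU configuration runs out of shader throughput before it runs out of bandwidth.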

BioShock: Infinite

BioShock Infinite is Irrational Games’ latest entry in the BioShock franchise. Though it’s based on Unreal Engine 3 – making it our obligatory UE3 game – Irrational has added a number of effects that make the game rather GPU-intensive at its highest settings. As an added bonus it includes a built-in benchmark composed of several scenes, a rarity for UE3 games, so we can easily get a good representation of what BioShock’s performance is like.

[Chart: BioShock Infinite]

If Metro was an example of the worst-case scenario for Richland, BioShock Infinite is the best-case scenario. Here the Radeon HD 8670D holds a 50% performance advantage over Intel's HD 4600 graphics.

[Chart: BioShock Infinite]

The gap narrows a bit at the higher resolution/quality settings, but Richland's advantage is still 39%.

Sleeping Dogs

A Square Enix title, Sleeping Dogs is one of the few open world games to ship with any kind of built-in benchmark, giving us a rare opportunity to test the genre. Like most console ports, Sleeping Dogs’ base assets are not extremely demanding, but it makes up for that with its interesting anti-aliasing implementation, a mix of FXAA and SSAA that at its highest settings does an impeccable job of removing jaggies. However, by effectively rendering the game world multiple times over, those high AA modes can also require a very powerful video card.
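As a rough illustration of why those high AA modes get so expensive, supersampling shades the scene at a multiple of the output resolution. The sketch below uses an assumed 4x sample factor purely for illustration, not Sleeping Dogs' actual internal render targets.

    # Back-of-the-envelope shading cost of supersampling (SSAA): output pixels
    # multiplied by the sample factor. The 4x factor is an illustrative
    # assumption, not the game's actual internal setting.
    def shaded_samples(width, height, ssaa_factor):
        return width * height * ssaa_factor

    native = shaded_samples(1920, 1080, 1)
    ssaa4x = shaded_samples(1920, 1080, 4)
    print("Native 1080p: %d samples/frame" % native)
    print("4x SSAA:      %d samples/frame (%.0fx the work)" % (ssaa4x, ssaa4x / native))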

[Chart: Sleeping Dogs]

Richland approaches 60 fps in our Sleeping Dogs benchmark at medium quality, which is definitely not bad at all. Its advantage over Intel's HD 4600 is 34%.

[Chart: Sleeping Dogs]

The performance advantage grows a bit at the higher quality/resolution settings, but we drop below the threshold of playability. With most of these games, however, you can trade image quality for resolution.

Tomb Raider (2013)

The simply titled Tomb Raider is the latest entry in the Tomb Raider franchise, making a clean break from past titles in plot, gameplay, and technology. Tomb Raider games have traditionally been technical marvels, and the 2013 iteration is no different. iGPUs won’t have quite enough power to use its marquee feature – DirectCompute accelerated hair physics (TressFX) – but even without it the game still looks quite good at its lower settings while posing a challenge for these integrated parts.

[Chart: Tomb Raider (2013)]

Tomb Raider is another title that doesn't put Richland in the best light, but it still ends up around 23% faster than Haswell GT2.

[Chart: Tomb Raider (2013)]

Battlefield 3

Our multiplayer action game benchmark of choice is Battlefield 3, DICE’s 2011 multiplayer military shooter. Its ability to pose a significant challenge to GPUs has been dulled some by time and drivers at the high end, but it’s still a challenge for entry-level GPUs such as the iGPUs found in Intel's and AMD's latest parts. Our goal here is to crack 60fps in our benchmark; our rule of thumb, based on experience, is that multiplayer framerates in intense firefights will bottom out at roughly half of our benchmark average, so a medium-high framerate here is not necessarily good enough.
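For reference, the 60fps target falls straight out of that rule of thumb; a minimal sketch of the arithmetic, assuming a playable firefight minimum of roughly 30fps:

    # Rule of thumb from above: intense multiplayer firefights bottom out at
    # roughly half of our benchmark average, so the average needed to hold a
    # given minimum framerate is simply double that minimum.
    def required_benchmark_average(target_minimum_fps):
        return target_minimum_fps * 2

    print(required_benchmark_average(30))  # ~60fps average to keep firefights near 30fps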

[Chart: Battlefield 3]

Richland's performance advantage in Battlefield 3 is around 30% over the HD 4600 regardless of quality/resolution.

[Charts: Battlefield 3]

Crysis 3

With Crysis 3, Crytek has gone back to trying to kill computers, taking back the “most punishing game” title in our benchmark suite. Only in a handful of setups can we even run Crysis 3 at its highest (Very High) settings, and the situation isn't too much better for entry-level GPUs at its lowest quality setting. In any case Crysis 1 was an excellent template for the kind of performance required to drive games for the next few years, and Crysis 3 looks to be much the same for 2013.

[Chart: Crysis 3]

Crysis 3 is another benchmark where Richland's performance advantage lands in the low 30% range.

[Charts: Crysis 3]

 

Comments

  • FriendlyUser - Thursday, June 6, 2013 - link

    Indeed, there is a $468 part. You can still fit a decent dGPU and a decent CPU on that budget for, once again, vastly superior performance. And you don't need Crossfire, but you do lose on power consumption, which is the only point in Iris's favor.
  • iwod - Thursday, June 6, 2013 - link

    I wonder how much of a discount OEMs generally get from Intel. 30% off the $440 tray price would be about $308/chip? If the CPU used to cost them $200 plus $100 for the GPU, I guess the space savings of a 2-in-1 solution and the lower power usage, while giving similar performance, are going to be attractive enough.
  • testbug00 - Friday, June 7, 2013 - link

    My desktop cost less than that... Mine is probably a little slower even with a 1.1GHz GPU and 4.4GHz CPU (my A10-5800K w/ 1866 OCed to 2133)
  • Sabresiberian - Friday, June 7, 2013 - link

    Yah, for me, the only consideration for a system with on-die CPU graphics is if I buy a low-end notebook that I want to do a little gaming on, and the chips with Iris price themselves out of that market. I've recommended AMD for that kind of product to my friends before, and I don't see any reason to change that.
  • Sabresiberian - Friday, June 7, 2013 - link

    What does Crossfire have to do with it? Using on-die graphics with an added discrete card doesn't have anything to do with Crossfire.
  • max1001 - Friday, June 7, 2013 - link

    Because AMD likes to call the APU+GPU combo Hybrid Crossfire.
  • Spunjji - Friday, June 7, 2013 - link

    Who said anything about Crossfire?!
  • MrSpadge - Thursday, June 6, 2013 - link

    No, Crystalwell also makes sense on any high-performance part, be it the topmost desktop K-series or the Xeons. That cache can add ~10% performance in quite a few applications, which equals 300-500 MHz more CPU clock. And at $300 there'd easily be enough margin left for Intel. But no need to push such chips...
  • Gigaplex - Thursday, June 6, 2013 - link

    There isn't a single K-series part with Crystalwell.
  • mdular - Thursday, June 6, 2013 - link

    As others have already pointed out, it's not the "most important information" at all. Crystalwell isn't available on a regular desktop socket.

    Most importantly though, that is also for a good reason: who would buy it? At the price point of the Crystalwell-equipped CPUs you would get hugely better gaming performance with an i3/i5/FX and a dedicated GPU. You can build an entire system from scratch for the same amount and game away with decent quality settings, often high, in full HD.

    There is a point to be made for HTPCs and gaming laptops/laplets, but I would assume that they don't sell a lot of them at the Crystalwell performance target.

    Since the article is about desktops, however, and considering all of the above, Crystalwell is pretty irrelevant in this comparison. If you seek the info on Crystalwell performance, I guess you will know where to find it.
