GPU Performance: Intel Iris Plus Gen11

Complementing Intel’s CPU upgrades for Ice Lake is an even more extensive upgrade on the GPU side of matters. The new chip architecture introduces Intel’s Gen 11 graphics, replacing the now very long in the tooth Gen 9.5 graphics, the core of which was first introduced back on Skylake all the way back in 2015. Gen 11, in turn, doesn’t turn Intel’s graphics ecosystem on its head (that will be Intel’s Xe in a couple of years), but it brings with it some much-needed improvements in both architecture and overall performance.

The single biggest change is quite literally how big the integrated GPU is: while Intel has always invested a more-than-respectable amount of die space in its high-end laptop chips, Ice Lake is taking this even further. Thanks to the density improvements of their 10nm process as well as layout optimizations and pure silicon investment, Intel’s standard GT2 graphics configuration has become a lot bigger and a lot more powerful. A full GT2 is now 64 execution units (EUs), up from 24 in Skylake-era chips. To be sure, Intel has offered big GPUs before in their boutique GT3 and GT4 configurations, but the important part here is that we’re talking about the kind of mainstream chips that the Dells of the world are going to be buying in bulk.

With all of the horsepower available in their Gen11 graphics, Intel’s overall gaming ambitions are a lot higher. Until now, Intel’s standard integrated GPUs have always been serviceable and perfectly acceptable for mass market games like Rocket League that are designed to run on a wide array of hardware. But they’ve seldom been up to the task of handling AAA games, which these days are made using the Xbox One and PlayStation 4 as their design baselines. So for Gen11, Intel is aiming to have graphics performance good enough to play more of these AAA games – even if it’s at the lowest quality setting.

Under the hood, driving these performance improvements are features like tile-based rendering, which help Intel further optimize their rendering pipeline by breaking up scenes to minimize the amount of memory bandwidth consumed. Speaking in broad strokes, previous Intel architectures have already implemented tiling, but Gen11 is the most advanced implementation yet, bringing Intel up to parity with NVIDIA and AMD. Similarly, Intel has further iterated on their lossless memory compression technology, squeezing out another 4% there. And, of course, Ice Lake just flat out gets a lot more memory bandwidth to play with: LPDDR4X-3733 is a whopping 75% increase in bandwidth over what LPDDR3-2133 could offer.
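That bandwidth claim is easy to sanity-check. Assuming the 128-bit (16-byte) memory bus typical of these mobile parts, peak theoretical bandwidth scales directly with the transfer rate, and the jump from LPDDR3-2133 to LPDDR4X-3733 works out to exactly the quoted 75%:

```python
# Peak theoretical bandwidth = transfer rate (MT/s) x bus width (bytes).
# A 128-bit (16-byte) memory bus is assumed, typical for these mobile parts.

BUS_WIDTH_BYTES = 128 // 8  # 16 bytes moved per transfer


def peak_bandwidth_gbs(transfer_rate_mts: int) -> float:
    """Peak bandwidth in GB/s for a given transfer rate in MT/s."""
    return transfer_rate_mts * BUS_WIDTH_BYTES / 1000


lpddr3 = peak_bandwidth_gbs(2133)   # ~34.1 GB/s
lpddr4x = peak_bandwidth_gbs(3733)  # ~59.7 GB/s

print(f"LPDDR3-2133:  {lpddr3:.1f} GB/s")
print(f"LPDDR4X-3733: {lpddr4x:.1f} GB/s")
print(f"Increase: {lpddr4x / lpddr3 - 1:.0%}")
```

Real-world bandwidth will of course land below these peak figures, but the ratio between the two memory types holds either way.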

As for non-gaming use cases, Intel’s Gen11 integrated GPUs also pack in an updated media encode/decode block. Users are unlikely to directly notice many of these changes, but they will be felt in many use cases such as video playback battery life, where the doubling up of engines means that Intel can run their video decode block at lower, more power-sipping clockspeeds. Video encode users, on the other hand, may get more out of these changes, as Intel has designed a new HEVC encoder path that should significantly improve their encoding efficiency (and thus quality) at a given bitrate.
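In practice, these fixed-function blocks are usually reached through Intel’s Quick Sync Video (QSV) path, for example via FFmpeg’s `hevc_qsv` encoder. As a rough sketch of what tapping that path looks like (the filenames and bitrate here are placeholders, and the snippet only builds the command rather than executing it):

```python
# Sketch: building an FFmpeg command line that uses Intel's Quick Sync
# Video (QSV) fixed-function blocks for decode and HEVC encode.
# Filenames and bitrate are placeholders; this only constructs the
# argument list rather than running FFmpeg.
import shlex


def qsv_hevc_cmd(src: str, dst: str, bitrate: str = "5M") -> list:
    return [
        "ffmpeg",
        "-hwaccel", "qsv",     # hardware-accelerated decode path
        "-i", src,
        "-c:v", "hevc_qsv",    # fixed-function HEVC encoder
        "-b:v", bitrate,       # target bitrate
        dst,
    ]


cmd = qsv_hevc_cmd("input.mp4", "output.mkv")
print(shlex.join(cmd))
# To actually run it: subprocess.run(cmd, check=True)
```

Whether the new encoder path delivers on the claimed quality-per-bitrate gains is something that would need to be measured per workload, but this is the kind of pipeline where end users would see the benefit.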

To see how the latest Gen11 graphics perform in the XPS 13, the laptop was run through our standard laptop GPU suite, as well as a few additional AAA games which we recently tested on the Ryzen-based Surface Laptop 3, to get a comparison against the latest Vega laptop GPU as well.

3DMark

3DMark offers a variety of tests of varying complexity, from the dGPU-focused Fire Strike down to the tablet-focused Ice Storm Unlimited. As the tests get less GPU intensive, they tend to become more CPU bound as well.

Unfortunately for the XPS 13, there’s currently an Intel GPU driver bug which prevents these tests from completing. We reached out to Dell and they are hoping to have an updated driver available soon, which we’ll test at that time and update these results accordingly.

GFXBench

GFXBench 5.0 Aztec Ruins Normal 1080p Offscreen

GFXBench 5.0 Aztec Ruins High 1440p Offscreen

Version 5.0 of Kishonti’s GFXBench brought some new DirectX 12 tests to the table, which we’ve added to our testing suite. AMD’s experience with low-level APIs has helped them with DX12, and Vega reaps that reward, but Intel’s latest GPU is nipping at its heels. Interestingly, both AMD’s integrated Vega and Intel’s integrated Iris Plus surpass the dGPU MX150 and MX250 from NVIDIA in these results.

Dota 2

Dota 2 Reborn - Value

Dota 2 Reborn - Enthusiast

Valve’s Dota 2 online battleground game offers a wide range of playable systems, including integrated graphics. The game tends to be somewhat CPU bound as well, so a strong CPU can really up the framerate. The XPS 13 does very well here, offering fantastic performance at our Value settings, and even reasonable performance at 1920x1080 with all options enabled. With just a few settings tweaks, this game would be very playable at 1920x1080 on the Iris Plus graphics. The extra CPU grunt also allowed the XPS 13 to pull ahead of Vega.

Tomb Raider

Tomb Raider - Value

Although it was a AAA title in its day, Tomb Raider can be playable on integrated graphics, especially now that both AMD and Intel offer quality iGPUs. The XPS 13 achieves almost 100 FPS at our Value settings in this game, which is quite impressive.

Rise of the Tomb Raider

Rise of the Tomb Raider - Value

The sequel to Tomb Raider is significantly more demanding of the GPU, and as such the Iris Plus can’t quite cope. It falls slightly behind the Vega GPU in the Surface Laptop 3, but at under 30 FPS, neither would be particularly great to use on this game.

Civilization VI

Civilization VI Enthusiast

Unfortunately, due to the way Civ VI detects hardware, we ran into the same issue with it on the XPS 13 as we did on the Surface Laptop 3, where it would only run at the native resolution, which in this case is 1920x1200.

GPU Conclusion

Intel has been well behind in terms of GPU performance for some time, but with little competition there was not much to be said about it. Since AMD came on the scene with their Ryzen mobile platform, however, the stakes have been raised. Ice Lake has responded quite well, with graphics performance well beyond Intel’s previous offerings, and able to trade blows with AMD’s Vega depending on the workload. AMD still seems to have the advantage in absolute GPU performance, but is likely held back by a less-capable CPU, so in games that are less GPU bound, Ice Lake can pull ahead. Regardless, it’s a huge step up for Intel, and unlike previous versions of their Iris graphics, there seem to be a lot of devices launching with the G7 graphics and the full 64 EUs. This is good news. The XPS 13 still isn’t going to challenge Dell’s Alienware brand for people looking at gaming laptops, but this is a major step forward, and brings Intel’s iGPU closer to parity with AMD’s Vega.

Comments

  • sorten - Friday, November 15, 2019 - link

    I agree. AMD and Microsoft are establishing the relationship and the basic design. Next year we won't be comparing brand new Intel vs. 2 year old AMD in this space and it will be a very different result.
  • skavi - Friday, November 15, 2019 - link

    Unless AMD changes their APU strategy, it's still going to be a year old.
  • dr.denton - Sunday, November 17, 2019 - link

    Not to say Intel didn't do great work on this, but we shouldn't forget this is a comparison between Intel's absolute top shelf CPU and AMD's mid-range. Ice Lake vs. R7 3870 will tell a different story.
  • 1_rick - Friday, November 15, 2019 - link

    "the now very long in the tooth Gen 9.5 graphics, the core of which was first introduced back on Skylake all the way back in 2019. "

    Mmm, yeah, it's practically antediluvian.
  • ghanz - Saturday, November 16, 2019 - link

    Yeah, it should be 2016... A typo I guess. Almost like a modern day GMA950 by now.
  • skavi - Friday, November 15, 2019 - link

    what power settings were used for battery life testing? The same as for performance testing?
  • Brett Howse - Friday, November 15, 2019 - link

    Optimized + Best Battery Life on Windows
  • skavi - Friday, November 15, 2019 - link

    thanks!
  • 29a - Friday, November 15, 2019 - link

    It would be nice to see video conversion benchmarks ran on the iGPU/ASIC, this is something that needs to be added to reviews.
  • Brett Howse - Friday, November 15, 2019 - link

    Great suggestion I'll add that test.
