GPU Performance: Intel Iris Plus Gen11

Complementing Intel’s CPU upgrades for Ice Lake is an even more extensive upgrade on the GPU side of matters. The new chip architecture introduces Intel’s Gen11 graphics, replacing the now very long in the tooth Gen 9.5 graphics, the core of which was first introduced with Skylake all the way back in 2015. Gen11, in turn, doesn’t turn Intel’s graphics ecosystem on its head (that will be Intel’s Xe in a couple of years), but it brings with it some much-needed improvements in both architecture and overall performance.

The single biggest change is quite literally how big the integrated GPU is: while Intel has always invested a more-than-respectable amount of die space in their high-end laptop chips, Ice Lake takes this even further. Thanks to the density improvements of their 10nm process, as well as layout optimizations and pure silicon investment, Intel’s standard GT2 graphics configuration has become a lot bigger and a lot more powerful. A full GT2 now packs 64 execution units (EUs), up from 24 in Skylake-era chips. To be sure, Intel has offered big GPUs before in their boutique GT3 and GT4 configurations, but the important part here is that we’re talking about the kind of mainstream chips that the Dells of the world are going to be buying in bulk.
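
Some rough math shows why the EU count matters. The figures below are assumptions drawn from Intel’s public Gen11 disclosures (two 4-wide FP32 ALUs per EU, each lane capable of one fused multiply-add per clock, and a boost clock of around 1.1 GHz for the Iris Plus G7), not measured numbers:

```python
# Back-of-the-envelope peak FP32 throughput for Intel iGPU configurations.
# Per-EU figures are assumptions from Intel's public Gen11 disclosures:
# 8 FP32 lanes per EU (two 4-wide ALUs), each doing an FMA (2 ops/clock).

def peak_fp32_gflops(eus: int, clock_ghz: float,
                     lanes_per_eu: int = 8, flops_per_lane: int = 2) -> float:
    """Peak single-precision throughput in GFLOPS."""
    return eus * lanes_per_eu * flops_per_lane * clock_ghz

print(peak_fp32_gflops(64, 1.1))   # Iris Plus G7 (Gen11): ~1126 GFLOPS
print(peak_fp32_gflops(24, 1.05))  # Skylake-era GT2:       ~403 GFLOPS
```

At roughly 1.1 TFLOPS, the full G7 configuration nearly triples the theoretical throughput of a Skylake-era GT2.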

With all of the horsepower available in their Gen11 graphics, Intel’s overall gaming ambitions are a lot higher. Until now, Intel’s standard integrated GPUs have always been serviceable and perfectly acceptable for mass-market games like Rocket League that are designed to run on a wide array of hardware. But they’ve seldom been up to the task of handling AAA games, which these days are made using the Xbox One and PlayStation 4 as their design baselines. So for Gen11, Intel is aiming to have graphics performance good enough to play more of these AAA games – even if it’s at the lowest quality setting.

Under the hood, driving these performance improvements are features like tile-based rendering, which helps Intel further optimize their rendering pipeline by breaking up scenes to minimize the amount of memory bandwidth consumed. Speaking in broad strokes, previous Intel architectures have already implemented forms of tiling, but Gen11 is the most advanced implementation yet, bringing Intel up to parity with NVIDIA and AMD. Similarly, Intel has further iterated on their lossless memory compression technology, squeezing out roughly another 4% in bandwidth savings. And, of course, Ice Lake just flat-out gets a lot more memory bandwidth to play with: LPDDR4X-3733 is a whopping 75% increase in bandwidth over what LPDDR3-2133 could offer.
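
That 75% figure falls straight out of the transfer rates. As a quick sanity check, assuming both memory types sit on the same 128-bit (16-byte) bus, as they do on Ice Lake-U and its LPDDR3-based predecessors:

```python
# Rough check of the bandwidth claim. The 128-bit (16-byte) bus width
# is an assumption based on Ice Lake-U's dual-channel memory layout.

def bandwidth_gbps(transfers_mt_s: int, bus_bytes: int = 16) -> float:
    """Peak memory bandwidth in GB/s for a given transfer rate."""
    return transfers_mt_s * bus_bytes / 1000

lpddr3 = bandwidth_gbps(2133)    # ~34.1 GB/s
lpddr4x = bandwidth_gbps(3733)   # ~59.7 GB/s
print(f"{lpddr4x / lpddr3 - 1:.0%} more bandwidth")  # -> 75%
```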

As for non-gaming use cases, Intel’s Gen11 integrated GPUs also pack in an updated media encode/decode block. Users are unlikely to directly notice many of these changes, but they will be felt in use cases such as video playback battery life, where the doubling-up of engines means that Intel can run their video decode block at lower, more power-sipping clockspeeds. Video encode users, on the other hand, may get more out of these changes, as Intel has designed a new HEVC encoder path that should significantly improve their encoding efficiency (and thus quality) at a given bitrate.
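
For readers who want to try the new encode path themselves, the most accessible route is a Quick Sync-aware tool such as FFmpeg. The snippet below is a minimal sketch rather than Intel’s own tooling: the file names are placeholders, and it assumes an FFmpeg build with the hevc_qsv encoder enabled, where -global_quality requests a constant-quality mode:

```python
# Minimal sketch: hardware HEVC encode through FFmpeg's Quick Sync (QSV)
# path. Assumes an FFmpeg build with QSV support (e.g. --enable-libmfx);
# "input.mp4" and "output_hevc.mp4" are placeholder file names.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "input.mp4",          # any source clip
    "-c:v", "hevc_qsv",         # fixed-function HEVC encode on the iGPU
    "-global_quality", "25",    # constant-quality (ICQ-style) target
    "-c:a", "copy",             # pass the audio through untouched
    "output_hevc.mp4",
], check=True)
```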

To see how the latest Gen11 graphics perform in the XPS 13, the laptop was run through our standard laptop GPU suite, as well as a few additional AAA games that we recently ran on the Ryzen-based Surface Laptop 3, providing a comparison against the latest Vega laptop GPU as well.

3DMark

3DMark offers a variety of tests of varying complexity, from the dGPU-focused Fire Strike down to the tablet-focused Ice Storm Unlimited. As the tests get less GPU-intensive, they tend to become more CPU-bound as well.

Unfortunately for the XPS 13, there’s currently an Intel GPU driver bug which prevents these tests from completing. We reached out to Dell and they are hoping to have an updated driver available soon, which we’ll test at that time and update these results accordingly.

GFXBench

GFXBench 5.0 Aztec Ruins Normal 1080p Offscreen

GFXBench 5.0 Aztec Ruins High 1440p Offscreen

Version 5.0 of Kishonti’s GFXBench brought some new DirectX 12 tests to the table, which we’ve added to our testing suite. AMD’s experience with low-level APIs has helped them with DX12, and Vega reaps that reward, but Intel’s latest GPU is nipping at its heels. Interestingly, both AMD’s integrated Vega and Intel’s integrated Iris Plus surpass NVIDIA’s discrete MX150 and MX250 in these results.

Dota 2

Dota 2 Reborn - Value

Dota 2 Reborn - Enthusiast

Valve’s Dota 2 online battle arena game runs on a wide range of systems, including integrated graphics. The game tends to be somewhat CPU-bound as well, so a strong CPU can really up the framerate. The XPS 13 does very well here, offering fantastic performance at our Value settings, and even reasonable performance at 1920x1080 with all options enabled. The game would be very playable at 1920x1080 on the Iris Plus with just a few settings tweaks, and the extra CPU grunt allows the XPS 13 to pull ahead of Vega.

Tomb Raider

Tomb Raider - Value

Although a AAA title in its day, Tomb Raider is quite playable on integrated graphics, especially now that both AMD and Intel offer quality iGPUs. The XPS 13 achieves almost 100 FPS at our Value settings in this game, which is quite impressive.

Rise of the Tomb Raider

Rise of the Tomb Raider - Value

The sequel to Tomb Raider is significantly more demanding on the GPU, and as such the Iris Plus can’t quite cope. It falls slightly behind the Vega GPU in the Surface Laptop 3, but at under 30 FPS, neither GPU is particularly well-suited to this game.

Civilization VI

Civilization VI Enthusiast

Unfortunately, due to the way Civ VI detects hardware, we ran into the same issue with it on the XPS 13 as we did on the Surface Laptop 3, where it would only run at the native resolution, which in this case is 1920x1200.

GPU Conclusion

Intel has been well behind in GPU performance for some time, but with little competition there was not much to be said about it. Since AMD came onto the scene with their Ryzen mobile platform, however, the stakes have been raised. Ice Lake has proven to be a strong response, with graphics performance well beyond Intel’s previous offerings, and able to trade blows with AMD’s Vega depending on the workload. AMD likely still has the advantage in absolute GPU performance, but is held back by a less-capable CPU, so in games that are less GPU-bound, Ice Lake can pull ahead. Regardless, it’s a huge step up for Intel, and unlike previous versions of their Iris graphics, there seem to be a lot of devices launching with the G7 graphics and the full 64 EUs. This is good news. The XPS 13 still isn’t going to challenge Dell’s Alienware brand for people looking at gaming laptops, but this is a major step forward, and brings Intel’s iGPU closer to par with AMD’s Vega.

Comments

  • abufrejoval - Friday, November 15, 2019 - link

    First impression: This sure doesn't disappoint!

    1920x1200 makes a ton of sense (4k at 13" much less to my eyes), real CPU performance is better than Whiskey without draining the bottle, GPU performance is where I expected it coming from a Skylake Iris 550 notebook I own, physical design sounds great... Perhaps I'd wait for a Lenovo variant, because I do type a lot.

    With the higher-resolution touch screen and a pen, I'm not sure I'd ever be able to get it back from my daughter, who can paint for hours even on a 6" mobile phone.

    The most welcome surprise seemed to be the price: $1500 for 16GB and an i7 seems downright reasonable for what is most likely the current high-end.

    Alas, when I went into the local (EU) configurator and added 32GB RAM, that added €1000 for what is essentially a €50 item (16 GB SO-DIMM). Sure, it also added a 1TB NVMe (€100 total or another €50 for the delta) and a 4k display I don't care about, but at that point I can't but call it the usual rip-off: I like my 4k at 42" and storage to be replaceable.

    So I'll hold off and hang onto my Lenovo S730 (16GB RAM, 1TB Samsung, AX200), bought for €1200 in May, a little longer. That one also has a USB-C port in addition to the two Thunderbolts, and that turns out to be really useful day-to-day, especially if your (mini) TB dock doesn't supply power, too.

    But I note with satisfaction that at least in the mobile space Intel is still able to execute, and I wish them well, though I won't remotely consider any Intel part while there is Rome in the datacentre.
  • MASSAMKULABOX - Wednesday, November 20, 2019 - link

    I think these are priced for the business market... and it's "car" pricing: the base model is surprisingly cheap, but when you add satnav, better wheels, a heated sunroof, and an upgraded stereo, you've added 50% to the price. But 1000 bux for RAM is a trick learned from Apple.
    And where does this leave NVidia going forward? All their base (models) belong to us.
  • IntelUser2000 - Friday, November 15, 2019 - link

    I'm suspicious of Anandtech's battery life tests for laptops.

    This is the only review where it doesn't regress and does significantly better than the Whiskey Lake generation.

    Ice Lake does really well at idle, but in actual usage like web browsing it plummets. Perhaps it's time for them to update their tests.
  • yeeeeman - Saturday, November 16, 2019 - link

    Phoronix's review of the same laptops comes to the conclusion that Ice Lake is more power efficient than 14nm parts in both idle and heavy use.
  • timecop1818 - Saturday, November 16, 2019 - link

    Soldered-in Killer Wireless garbage again. Not buying another Dell ever until they switch that shit out. I don't care if it's made by Intel these days; as long as they keep promoting that retarded branding shit in their non-gamer laptops, they will not have my money.
  • Reflex - Saturday, November 16, 2019 - link

    Literally just install the Intel AX200 drivers. Took me 5 mins and works great.
  • timecop1818 - Saturday, November 16, 2019 - link

    PCI IDs are different, no?
  • Reflex - Sunday, November 17, 2019 - link

    Doesn't seem to be. It will tell you it's the incorrect driver, but once you force it, it will install, and future driver updates via Intel's updater will pick it up and install without question.
  • timecop1818 - Sunday, November 17, 2019 - link

    Yeah, that means it's a different PCI ID. Because if they were included in the Intel driver INF, it would install without forcing. Intel would prolly be PCI\VEN_8086&DEV_xxxx and the Killer stuff probably uses their own vendor ID. Anyway, I'm happy with my current gem-cut Spectre x360, which has a better screen and keyboard than the XPS 13 anyway.
  • Reflex - Sunday, November 17, 2019 - link

    I hear it's a good laptop; it was on my list until I noticed it didn't have a 32GB RAM option. They both appeal to slightly different requirements IMO, and I'd recommend either depending on the user and what they need.
