Intel Iris Pro 5200 Graphics Review: Core i7-4950HQ Tested
by Anand Lal Shimpi on June 1, 2013 10:01 AM EST

3DMarks & GFXBenchmark
We don't use 3DMark to draw GPU performance conclusions but it does make for a great optimization target. Given what we've seen thus far, and Intel's relative inexperience in the performance GPU space, I wondered if Iris Pro might perform any differently here than in the games we tested.
It turns out, Iris Pro does incredibly well in all of the 3DMarks, ranging from tying the GT 650M to outperforming it. Obviously none of this has any real-world impact, but it is very interesting. Is Intel's performance here the result of these benchmarks being lighter on Intel's weaknesses, or is it an indication of what's possible with more driver optimization?
I also included GFXBenchmark 2.7 (formerly GL/DXBenchmark) as another datapoint for measuring the impact of MSAA on Iris Pro:
Iris Pro goes from performance competitive with the GT 650M to nearly half its speed once you enable 4X MSAA. Given the level of performance Iris Pro offers, I don't see many situations where AA will be enabled, but it's clear that this is a weak point of the microarchitecture.
177 Comments
beginner99 - Saturday, June 1, 2013 - link
Impressive...if you ignore the pricing.

tipoo - Sunday, June 2, 2013 - link
?

velatra - Saturday, June 1, 2013 - link
On page 4 of the article there's a word "presantive" which should probably be "representative."

jabber - Saturday, June 1, 2013 - link
May I ask why The Sims is never featured in your reviews of such GPU setups? Why? Well, in my line of business, fixing and servicing lots of laptops with integrated chips, the one game that crops up over and over again is The Sims!
Never had a laptop in from the real world that had any of the games you benchmarked here. But lots of them get The Sims played on them.
JDG1980 - Saturday, June 1, 2013 - link
Agreed. The benchmark list is curiously disconnected from what these kinds of systems are actually used for in the real world. Seldom does anyone use a laptop of any kind to play "triple-A" hardcore games. Usually it's stuff like The Sims and WoW. I think those should be included as benchmarks for integrated graphics, laptop chipsets, and low-end HTPC-focused graphics cards.

tipoo - Saturday, June 1, 2013 - link
Because The Sims is much easier to run than most of these. Just because people tried running it on GMA graphics and wondered why it didn't work doesn't mean it's a demanding workload.

jabber - Saturday, June 1, 2013 - link
Yes, but the point is the games tested are pretty much pointless. How many here would bother to play them on such equipped laptops? Pretty much none.
But plenty of 'normal' folks who would buy such equipment will play plenty of lesser games. In my job looking after 'normal' folks, that's quite important when parents ask me about buying a laptop for their kid who wants to play a few games on it.
The world, and sites such as AnandTech, shouldn't just revolve around the whims of 'gamer dudes', especially as it appears the IT world is generally moving away from gamers.
It's a general computing world in the future, rather than an enthusiast computing world like it was 10 years ago. I think some folks need to re-align their expectations going forward.
tipoo - Sunday, June 2, 2013 - link
I mean, if it can run something like Infinite or even Crysis 3 fairly well, you can assume it would run The Sims well.

Quizzical - Saturday, June 1, 2013 - link
It would help immensely if you would say what you were comparing it to. As you are surely aware, a system that includes an A10-5800K but cripples it by leaving a memory channel vacant and running the other at 1333 MHz won't perform at all similarly to a properly built system with the same A10-5800K and two 4 GB modules of 1866 MHz DDR3 in properly matched channels.

That should be an easy fix by adding a few sentences to page 5, but without it the numbers don't mean much, as you're basically considering Intel graphics in isolation without a meaningful AMD comparison.
Quizzical - Saturday, June 1, 2013 - link
Ah, it looks like the memory clock speeds have been added. Thanks for that.