Image Quality

Software compatibility and image quality remain understandable concerns; however, Intel has improved tremendously in both areas over the past couple of years. I couldn't run Total War: Shogun 2 on Iris Pro, but every other game I threw at the system ran without errors, a significant improvement over where things stood not too long ago. On the compute side, I couldn't get our Folding@Home benchmark to work, but everything else ran well.

On the image quality front I didn't see much to be concerned about. I noticed some occasional texture flashing in Battlefield 3, but it was never something I could grab a screenshot of quickly enough. Intel seems pretty quick about addressing any issues that crop up, and as a company it has considerably increased staffing and resources on the driver validation front.

The gallery below has a series of images taken from some of the benchmarks in our suite. I didn't notice any obvious differences between Intel and NVIDIA render quality. By virtue of experience and focus I expect software compatibility, image quality, and driver/hardware efficiency to be better on the NVIDIA side of the fence. At the same time, I have no reason to believe that Intel isn't serious about continuing to address those areas going forward. Intel as a company has gone from begging software developers to at least let their code run on Intel integrated graphics, to actively working with game developers to introduce new features and rendering techniques.

177 Comments
  • beginner99 - Saturday, June 1, 2013 - link

    Impressive...if you ignore the pricing.
  • tipoo - Sunday, June 2, 2013 - link

    ?
  • velatra - Saturday, June 1, 2013 - link

    On page 4 of the article there's a word "presantive" which should probably be "representative."
  • jabber - Saturday, June 1, 2013 - link

    May I ask why The Sims is never featured in your reviews on such GPU setups?

    Why? Well, in my line of business, fixing and servicing lots of laptops with integrated chips, the one game that crops up over and over again is The Sims!

    Never had a laptop in from the real world that had any of the games you benchmarked here. But lots of them get The Sims played on them.
  • JDG1980 - Saturday, June 1, 2013 - link

    Agreed. The benchmark list is curiously disconnected from what these kinds of systems are actually used for in the real world. Seldom does anyone use a laptop of any kind to play "Triple-A" hardcore games. Usually it's stuff like The Sims and WoW. I think those should be included as benchmarks for integrated graphics, laptop chipsets, and low-end HTPC-focused graphics cards.
  • tipoo - Saturday, June 1, 2013 - link

    Because the Sims is much easier to run than most of these. Just because people tried running it on GMA graphics and wondered why it didn't work doesn't mean it's a demanding workload.
  • jabber - Saturday, June 1, 2013 - link

    Yes but the point is the games tested are pretty much pointless. How many here would bother to play them on such equipped laptops?

    Pretty much none.

    But plenty of 'normal' folks who would buy such equipment will play plenty of lesser games. In my job looking after 'normal' folks, that's quite important when parents ask me about buying a laptop for their kid who wants to play a few games on it.

    The world, and sites such as AnandTech, shouldn't just revolve around the whims of 'gamer dudes', especially as the IT world appears to be generally moving away from gamers.

    The future is a general computing world, rather than the enthusiast computing world it was 10 years ago. I think some folks need to re-align their expectations going forward.
  • tipoo - Sunday, June 2, 2013 - link

    I mean, if it can run something like Infinite or even Crysis 3 fairly well, you can assume it would run the Sims well.
  • Quizzical - Saturday, June 1, 2013 - link

    It would help immensely if you would say what you were comparing it to. As you are surely aware, a system that includes an A10-5800K but cripples it by leaving a memory channel vacant and running the other at 1333 MHz won't perform at all similarly to a properly built system with the same A10-5800K with two 4 GB modules of 1866 MHz DDR3 in properly matched channels.

    That should be an easy fix by adding a few sentences to page 5, but without it, the numbers don't mean much, as you're basically considering Intel graphics in isolation without a meaningful AMD comparison.
  • Quizzical - Saturday, June 1, 2013 - link

    Ah, it looks like the memory clock speeds have been added. Thanks for that.
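Quizzical's point about memory configuration comes down to bandwidth arithmetic: each 64-bit DDR3 channel moves 8 bytes per transfer, so halving the channel count and dropping the data rate both cut directly into the bandwidth an iGPU depends on. A minimal sketch of the theoretical peak numbers (these are JEDEC peak rates, not measured throughput):

```python
def ddr3_peak_bandwidth_gbs(transfer_rate_mts: int, channels: int) -> float:
    """Theoretical peak bandwidth in GB/s: MT/s * 8 bytes per 64-bit channel * channels."""
    return transfer_rate_mts * 8 * channels / 1000

# The two configurations from the comment above:
crippled = ddr3_peak_bandwidth_gbs(1333, channels=1)  # single-channel DDR3-1333
proper = ddr3_peak_bandwidth_gbs(1866, channels=2)    # dual-channel DDR3-1866

print(f"{crippled:.1f} GB/s vs {proper:.1f} GB/s ({proper / crippled:.1f}x)")
# -> 10.7 GB/s vs 29.9 GB/s (2.8x)
```

A nearly 3x gap in available bandwidth is why the same A10-5800K can post very different gaming numbers depending on how the memory is populated.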
