Discrete GPU Gaming

When comparing CPUs to APUs, one strength team Blue has shown in the past is discrete GPU performance. However, even with dual graphics cards at 1920×1080, we seem to have hit a wall where extra CPU performance does not necessarily translate into more frames per second. Our results below show little difference between the Haswell processors, and we have to drop down to a 2.0 GHz i7 or a 3.5 GHz i3 before frame rates fall significantly. The biggest benefit from overclocking appears in the F1 2013 minimum frame rates.
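Chart values like these are typically derived from per-frame render times: average FPS divides frames rendered by elapsed time, while minimum FPS is set by the single slowest frame, which is why minimums tend to respond more visibly to CPU headroom. A minimal sketch of that arithmetic (not our actual benchmark pipeline, and with made-up frame times):

```python
# Illustrative per-frame render times in milliseconds (fabricated data,
# not taken from the review's benchmark runs).
frame_times_ms = [16.7, 15.9, 17.2, 33.1, 16.4, 18.0, 16.9, 41.5, 16.6, 17.1]

# Average FPS: total frames divided by total elapsed time in seconds.
total_s = sum(frame_times_ms) / 1000.0
average_fps = len(frame_times_ms) / total_s

# Minimum FPS: the instantaneous rate during the single slowest frame.
minimum_fps = 1000.0 / max(frame_times_ms)

print(f"Average FPS: {average_fps:.1f}")
print(f"Minimum FPS: {minimum_fps:.1f}")
```

Note how one 41.5 ms stutter drags the minimum far below the average even though the run is otherwise smooth; a faster CPU that trims those worst-case frames moves the minimum without changing the average much.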

F1 2013

Discrete SLI, Average FPS, F1 2013

Discrete SLI, Minimum FPS, F1 2013

Bioshock Infinite

Discrete SLI, Average FPS, Bioshock Infinite

Discrete SLI, Minimum FPS, Bioshock Infinite

Tomb Raider

Discrete SLI, Average FPS, Tomb Raider

Discrete SLI, Minimum FPS, Tomb Raider

Sleeping Dogs

Discrete SLI, Average FPS, Sleeping Dogs

Discrete SLI, Minimum FPS, Sleeping Dogs

Company of Heroes 2

Discrete SLI, Average FPS, Company of Heroes 2

Discrete SLI, Minimum FPS, Company of Heroes 2

Battlefield 4

Discrete SLI, Average FPS, Battlefield 4

Discrete SLI, Minimum FPS, Battlefield 4


103 Comments


  • vastac13 - Friday, July 11, 2014 - link

    Just when I'm about to go to bed... Good thing I checked 1 more time :D Happy reading folks!
  • Iketh - Friday, July 11, 2014 - link

    I just hope Intel has another massive breakthrough in performance soon because rendering a 4K mp4 is gonna SUCK

    Don't need AMD for this case, the industry will drive it this time...
  • ddriver - Friday, July 11, 2014 - link

    Looks like desktop CPU performance is hitting a brick wall, the last 4 generations are barely incremental in performance. Could very well be the product of AMD ultimately failing to compete in the high end. I don't complain, this way I don't feel the urge to upgrade my 3770k.
  • Antronman - Thursday, July 17, 2014 - link

    IBM says silicon will end up limiting CPU capability, and is investigating alternatives.
  • wrkingclass_hero - Sunday, July 13, 2014 - link

    It's funny, I'm actually rendering a 4K mp4 right now! My 4.6 GHz 3930k looks like it's going to take a grand total of 21 hours to render it... and it's a 3 minute clip...
    But there are a lot of effects on it (stabilization, layers of videos, etc.) It also doesn't help that it is a two pass.
  • ddriver - Monday, July 14, 2014 - link

    What do you expect, throwing such a workload on a measly single CPU - such tasks are performed by rendering farms with thousands of quad socket machines.
  • Braincruser - Sunday, July 13, 2014 - link

    Rendering should be left to GPU cores in shaders. They scale much, much better than CPUs.
  • Mark-Benney - Thursday, August 14, 2014 - link

    I wrote out a Mandelbrot program on an Acorn Electron 32K back in the very early days. It took 48 hours to complete, propped up on books with a fan placed under it. Lol, bet you were not even born when I first wrote programs in DOS/machine code/Pascal/BBC BASIC, and still in nappies when I was overclocking an Intel Celeron from 233 MHz to a stable 24/7 367 MHz.
  • CrystalBay - Monday, July 14, 2014 - link

    Thanks Dr. Ian, I love my Intel 4790K @ 4.8 GHz, and I also love the Asus Z97 Deluxe. This is the simplest way to OC in my 25 years of building PCs... Screw it being AVX stable, capital BS, never-will-be-used instructions. Any modern chip fails at it anyway... Go Devils, go AMD
  • superjim - Tuesday, July 15, 2014 - link

    Intel hasn't made a good OCing chip since Sandy Bridge. Devil's Canyon just reinforces how good SB was. Nearly every i5 and i7 chip could hit 4.4 GHz without issue, with most at 4.6+ on a good air cooler. Raise your hand if you're still on SB only because there is nothing better...
