GRID: Autosport

No graphics test suite is complete without some input from Codemasters and the EGO engine, so for this round of testing we turn to GRID: Autosport, the latest iteration in the GRID racing series. As with the previous racing titles we have tested, each update to the engine aims to add effects, reflections, detail and realism, with Codemasters making ‘authenticity’ a main focal point for this version.

GRID’s benchmark mode is very flexible, so we created a test race using a shortened version of the Red Bull Ring with twelve cars doing two laps. The car in focus starts last and is quite fast, usually finishing second or third. Both the average and minimum frame rates are recorded.
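
As a point of reference for how these two metrics relate, the sketch below shows one straightforward way to derive average and minimum FPS from a per-frame timing log. It assumes a FRAPS-style capture with one frametime in milliseconds per line; the file name and format are illustrative, not the tooling actually used for this review.

```python
# Minimal sketch: derive average and minimum FPS from a per-frame timing log.
# Assumes one frametime (in milliseconds) per line; the file name is hypothetical.

def summarize_frametimes(path: str) -> tuple[float, float]:
    """Return (average FPS, minimum FPS) for one benchmark run."""
    with open(path) as f:
        frametimes_ms = [float(line) for line in f if line.strip()]

    total_time_s = sum(frametimes_ms) / 1000.0
    average_fps = len(frametimes_ms) / total_time_s   # frames rendered per second over the whole run
    minimum_fps = 1000.0 / max(frametimes_ms)         # slowest single frame sets the lowest instantaneous FPS
    return average_fps, minimum_fps


if __name__ == "__main__":
    avg, low = summarize_frametimes("grid_autosport_frametimes.txt")
    print(f"Average: {avg:.1f} FPS, Minimum: {low:.1f} FPS")
```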

For this test we used the following settings with our graphics cards:

GRID: Autosport Settings

Low GPU (Integrated Graphics; ASUS R7 240 1GB DDR3): 1920x1080, Medium quality
Medium GPU (MSI GTX 770 Lightning 2GB; MSI R9 285 Gaming 2G): 1920x1080, Maximum quality
High GPU (ASUS GTX 980 Strix 4GB; MSI R9 290X Gaming 4G): 1920x1080, Maximum quality
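
For completeness, the settings table can also be written down as a small test matrix; the Python below is purely illustrative bookkeeping, not the automation actually used for the review.

```python
# The GPU tiers and settings above as a simple test matrix (illustrative only).
TEST_MATRIX = {
    "Low GPU":    {"gpus": ["Integrated Graphics", "ASUS R7 240 1GB DDR3"],
                   "resolution": "1920x1080", "quality": "Medium"},
    "Medium GPU": {"gpus": ["MSI GTX 770 Lightning 2GB", "MSI R9 285 Gaming 2G"],
                   "resolution": "1920x1080", "quality": "Maximum"},
    "High GPU":   {"gpus": ["ASUS GTX 980 Strix 4GB", "MSI R9 290X Gaming 4G"],
                   "resolution": "1920x1080", "quality": "Maximum"},
}

# Enumerate every run that the settings table implies.
for tier, cfg in TEST_MATRIX.items():
    for gpu in cfg["gpus"]:
        print(f"{tier}: {gpu} @ {cfg['resolution']}, {cfg['quality']} quality")
```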

Integrated Graphics

[Chart: GRID: Autosport on Integrated Graphics, Average and Minimum FPS]

The gap between the AMD APUs and the Intel CPUs again amounts to a 33-50% difference in frame rates, to the point where at 1080p Medium the integrated graphics solutions do not break 30 FPS on minimum frame rates. The i3-6300's higher graphics frequency and larger L3 cache again give it an edge over the i3-6100.

Discrete Graphics

[Chart: GRID: Autosport on ASUS R7 240 DDR3 2GB ($70), Average and Minimum FPS]

[Chart: GRID: Autosport on MSI R9 285 Gaming 2GB ($240), Average and Minimum FPS]

[Chart: GRID: Autosport on MSI GTX 770 Lightning 2GB ($245), Average and Minimum FPS]

[Chart: GRID: Autosport on MSI R9 290X Gaming LE 4GB ($380), Average and Minimum FPS]

[Chart: GRID: Autosport on ASUS GTX 980 Strix 4GB ($560), Average and Minimum FPS]

With the discrete GPUs, there are multiple avenues to take with this analysis.

On the low-end cards, the choice of CPU makes little difference in our tests.

On the mid-range and high-end cards, the power of the CPU has a larger effect with AMD discrete cards than with NVIDIA discrete cards, except when the AMD Athlon X4 845 is in play. When paired with the mid-range AMD discrete card, the X4 845 keeps up well enough with the i3 parts for its price, but falls away a bit more on the high-end AMD discrete GPU. With NVIDIA GPUs, the Athlon X4 845 sits at the bottom and the main AMD challengers are the FX parts.

So the rules of thumb for the EGO engine would seem to be (a rough sketch of these rules in code follows the list):

AMD Carrizo CPU + AMD discrete GPU is OK; the lower-powered the GPU, the better.
AMD FX CPU + NVIDIA discrete GPU is OK.
Intel CPU + any discrete GPU works well.
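
Purely as an illustration (not a benchmark-derived model), those rules of thumb could be encoded as a simple lookup; the family labels below are this sketch's own.

```python
# Illustrative encoding of the pairing rules of thumb above; labels are this sketch's own.
def ego_pairing_advice(cpu_family: str, gpu_vendor: str) -> str:
    """Rough EGO-engine pairing guidance based on the results in this review."""
    if cpu_family == "Intel":
        return "Works well with any discrete GPU."
    if cpu_family == "AMD Carrizo":            # e.g. Athlon X4 845
        if gpu_vendor == "AMD":
            return "OK, and the lower-powered the GPU the better."
        return "Sits at the bottom of the pack with NVIDIA cards."
    if cpu_family == "AMD FX":
        if gpu_vendor == "NVIDIA":
            return "OK pairing."
        return "CPU choice matters more with AMD cards, especially at the high end."
    return "No data for this combination in this review."


print(ego_pairing_advice("AMD Carrizo", "AMD"))
```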

One could attribute the differences between the discrete GPU choices to driver implementation, IPC, or how each GPU company focuses its optimization for the game at hand (frequency vs. threads vs. caches).

Comments

  • nightbringer57 - Monday, August 8, 2016 - link

    Hard question.

    My guess would be that such models are Core i3s with defective iGPUs, and overall lower binned, mostly destined for OEMs that could negotiate a lower price for almost identical performance (3% less frequency = no noticeable difference), in models that typically ship with low-end dGPUs. At the same time, this avoids price-dumping the other i3s in the retail market (retail prices are always much more variable than the MSRP, and I would guess you could find them slightly cheaper).

    Once again, 3% frequency and 3W TDP don't make for much of a difference.
  • DanNeely - Monday, August 8, 2016 - link

    Yeah, it definitely looks like a binning dumpster - trying to salvage the last bit of value from chips with working HT but a damaged GPU that needed to be partially fused off. If the list price were marginally lower I wouldn't have thought anything of it, although I suppose Intel could be willing to offer better volume discounts behind the scenes.
  • extide - Monday, August 8, 2016 - link

    Yup, the 6098P has GT1 graphics with only 12 EUs, vs GT2 and 24 EUs in all of the other i3s. I bet they are harvesting chips with bad EUs. As far as price goes, I am sure that whatever OEM is buying those is paying less than the prices on ARK. Intel is kinda famous for having tons of CPUs all at the same price, but the OEMs buying them are going to be paying totally different prices than what's on the price sheets/ARK. I would imagine the prices they negotiate end up being lower for the lower models and higher for the higher models, even if they are all listed the same on ARK.
  • Ratman6161 - Monday, August 8, 2016 - link

    I did a quick check and did not find any 6098s for sale on Newegg or Amazon. But I could see a position for them if the street price is less than a 6100. For anyone who is not going to use the integrated graphics anyway, saving a few more bucks on the CPU could be worthwhile. It has to be cheaper than a 6100 though, because otherwise you would just get the 6100.

    Since I'm not finding any for sale, I'm also wondering if they will mainly be sold to OEMs and end up with people who wouldn't know the difference anyway in their low-end Dell or HP desktops.
  • kuntakinte - Monday, August 8, 2016 - link

    Nice selective test :-). In comparison with the rather old i3-4330 (3.5 GHz), Skylake shines.
    But maybe you can add the fastest Haswell i3 (i3-4370, 3.8 GHz) to the charts; it sits exactly in the middle of the three tested CPUs. But then I suppose the Skylake "advantage" will drop to a mere 2-5%.
  • lefty2 - Monday, August 8, 2016 - link

    Actually, I was surprised that the iGPU sees zero improvement since Haswell.
  • ImSpartacus - Monday, August 8, 2016 - link

    This is an awesome subject that I've been fascinated by. Good to see a proper review.
  • AndrewJacksonZA - Monday, August 8, 2016 - link

    Interesting that you kept the WinRAR test and let the 7-Zip test go to the "Legacy" section. Why? Did you do a coin toss between the two? :-)
  • stephenbrooks - Monday, August 8, 2016 - link

    Right... a friend actually persuaded me to migrate *from* WinRAR *to* 7-Zip because it offered better compression.
  • DanNeely - Monday, August 8, 2016 - link

    As a file compression utility, 7-Zip is better than WinRAR. Where WinRAR stands out is as one of the very few real-world applications whose performance is hugely dependent on memory speed, which makes it a great benchmark.
