GRID Autosport

No graphics tests are complete without some input from Codemasters and the EGO engine, which means for this round of testing we point towards GRID: Autosport, the next iteration in the GRID racing franchise. As with our previous racing tests, each update to the engine aims to add effects, reflections, detail and realism, with Codemasters making ‘authenticity’ a main focal point for this version.

GRID’s benchmark mode is very flexible, and as a result we created a test race using a shortened version of the Red Bull Ring with twelve cars doing two laps. The car in focus starts last and is quite fast, but usually finishes second or third. For low-end graphics we test at 1080p medium settings, whereas mid- and high-end graphics get the full 1080p maximum settings. Both the average and minimum frame rates are recorded.
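
For readers reproducing this at home, the minimal sketch below shows one common way to turn a per-frame time log into average and minimum frame rates. The log file name, its one-millisecond-value-per-line format, and the helper name are illustrative assumptions on our part; the benchmark itself reports these figures directly.

```python
# Sketch: derive average and minimum FPS from a frame-time log.
# Assumes a hypothetical "frametimes.log" with one frame time per
# line, in milliseconds.

def fps_stats(frame_times_ms):
    """Return (average_fps, minimum_fps) for a list of frame times in ms."""
    total_ms = sum(frame_times_ms)
    # Average FPS is total frames over total elapsed time,
    # not the mean of per-frame FPS values.
    average_fps = len(frame_times_ms) / (total_ms / 1000.0)
    # The slowest (longest) single frame sets the minimum frame rate.
    minimum_fps = 1000.0 / max(frame_times_ms)
    return average_fps, minimum_fps

if __name__ == "__main__":
    with open("frametimes.log") as f:
        times = [float(line) for line in f if line.strip()]
    avg, mn = fps_stats(times)
    print(f"Average: {avg:.1f} FPS, Minimum: {mn:.1f} FPS")
```

Note that averaging per-frame FPS values instead of dividing frames by total time would overweight fast frames, which is why the total-time method is the conventional choice.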

[Charts: GRID: Autosport on ASUS GTX 980 Strix 4GB ($560); MSI GTX 770 Lightning 2GB ($245); MSI R9 285 Gaming 2GB ($240); ASUS R7 240 DDR3 2GB ($70); Integrated Graphics]

Comments

  • Magichands8 - Tuesday, January 3, 2017 - link

    This Optane looks completely useless. But Optane DRAM sounds like it could be interesting, depending upon how much slower it is.
  • lopri - Wednesday, January 4, 2017 - link

    I immediately thought of "AMD Memory" which AMD launched after the Bulldozer flop. But then again Intel have been going after this storage caching scheme for years now and I do not think it has taken them anywhere.
  • smilingcrow - Tuesday, January 3, 2017 - link

    Game frame rates to two decimal places add nothing but make it harder to scan the numbers quickly. Enough already.
  • 1_rick - Tuesday, January 3, 2017 - link

    One place where the extra threads of an i7 are useful is if you're using VMs (maybe to run a database server or something.) I've found that an i5 with 8GB can get bogged down pretty drastically just from running a VM.
  • Meteor2 - Tuesday, January 3, 2017 - link

    I kinda think that if you're running db VMs on an i7, you're doing it wrong.
  • t.s - Wednesday, January 4, 2017 - link

    or if you're an Android developer using Android Studio.
  • lopri - Wednesday, January 4, 2017 - link

    That is true, but as others have implied, you can get Xeons with 6 or 8 real cores for cheaper than these new Kaby Lake chips if VMs are what you need CPU performance for.
  • ddhelmet - Tuesday, January 3, 2017 - link

    Well, I am more glad now that I got a Skylake. Even if I had waited for this, the performance increase would not be worth it.
  • Kaihekoa - Tuesday, January 3, 2017 - link

    Y'all need to revise your game benchmarking analysis. At least use some current generation GPUs, post the minimum framerates, and test at 1440p. The rest of your review is exceptional, but the gaming part needs some modernization, please.
  • lopri - Wednesday, January 4, 2017 - link

    I thank the author for a clear yet thorough review. A lot of ground is covered, and the big picture of the chip's performance and features is well communicated. I agree with the author's recommendation at the end as well. I have not felt that I am missing out on anything compared to i7s while running a 2500K in my gaming system, and unless you know for certain that you can take advantage of Hyper-Threading, spending the difference in dollars on an SSD or a graphics card is a wiser expenditure that will provide a better computing experience.

    Having said that, I am still not compelled to upgrade my 2500K, which has been running at 4.8 GHz for years. (It does 5.0 GHz no problem, but I run it at 4.8 to leave some "headroom".) While the 7600K finally looks like a barely worthy upgrade in its own right, the added cost cannot be overlooked. A new motherboard, new memory, and potentially a new heat sink will quickly add to the budget, and I am not sure it is going to be worth all the expenses that will follow.

    Of course, all that could be worthwhile if overclocking were fun, but Intel has pretty much killed overclocking and the overclocking community. Intentionally, if I might add. Today overclocking does not give one a sense of discovery or accomplishment. Competition between friendly enthusiasts or rival motherboard/memory vendors has disappeared. Naturally there is no accumulation or exchange of knowledge in the community, and conversations have become frustrating and vain due to a lack of overclocking expertise. Only brute-force overclocking with dedicated cooling has some following, and the occasional "overclocking" topics in the forums are really braggadocio in disguise, where the competition underneath is about who spent the most on their rig with the latest blingy stuff. Needless to say, those are not as exciting or illuminating as the real overclocking of yore, and in my opinion there are better ways to spend money on such self-gratification without the complication that often accompanies overclocking and, in the end, fails to impress.

    Intel might have second thoughts about its overclocking policies now, but like many things Intel has done in recent years, it is too little, too late. And their chips have no headroom anyway. My work system is due for an upgrade, and I am probably going to pick up a couple of Xeon E5-2670s, which will give me 16 real cores for less than $200. Why bother with the new stuff when the IPC gain is meager and there is no fun in overclocking? And contribute to Intel's revenue? Thank you, but no thank you.

    P.S. Sorry, I meant to commend the author for the excellent (albeit redundant) review but ended up ranting about something else. Oh well, carry on.
