Gaming Benchmarks: High End

On this page are our 2015 high-end results with the top models at their respective release dates – the GTX 980 and R9 290X. Results for the R7 240, GTX 770 and R9 285 can be found in our benchmark database, Bench.

Alien: Isolation

If first-person survival mixed with horror is your sort of thing, then Alien: Isolation, based on the Alien franchise, should be an interesting title. Developed by The Creative Assembly and released in October 2014, Alien: Isolation has won numerous awards, from Game of the Year to several top 10/25 lists and Best Horror titles, racking up over a million sales by February 2015. Alien: Isolation uses a custom-built engine which includes dynamic sound effects and should be fully multi-core enabled.

For low end graphics, we test at 720p with Ultra settings, whereas for mid and high range graphics we bump this up to 1080p, taking the average frame rate as our marker with a scripted version of the built-in benchmark.
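One subtlety worth noting: the average frame rate for a run is the total frame count divided by the total run time, not the mean of per-frame instantaneous FPS values. A minimal sketch of the calculation, using a hypothetical frame-time log rather than our actual benchmark tooling:

```python
def average_fps(frame_times_ms):
    """Average frame rate over a run, from per-frame render times in ms.

    Computed as frames / total seconds, so a single long stutter frame
    drags the average down by the time it occupies, not just 1/N.
    """
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

# Example: three fast 10 ms frames plus one 70 ms stutter frame.
times = [10.0, 10.0, 10.0, 70.0]
print(average_fps(times))  # 4 frames / 0.1 s = 40.0 FPS
```

Averaging the instantaneous FPS of each frame instead would report roughly 78 FPS for the same run, which is why frame-count-over-time is the standard convention.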

Alien: Isolation on MSI R9 290X Gaming LE 4GB ($380)

Alien: Isolation on ASUS GTX 980 Strix 4GB ($560)

Total War: Attila

The Total War franchise moves on to Attila, another development from The Creative Assembly. This stand-alone strategy title is set in 395 AD, with a main story line that puts the gamer in control of the leader of the Huns in order to conquer parts of the world. Graphically, the game can render hundreds or thousands of units on screen at once, each with their own individual actions, and can put some of the big cards to task.

For low end graphics, we test at 720p with performance settings, recording the average frame rate. With mid and high range graphics, we test at 1080p with the quality setting. In both circumstances, unlimited video memory is enabled and the in-game scripted benchmark is used.

Total War: Attila on MSI R9 290X Gaming LE 4GB ($380)

Total War: Attila on ASUS GTX 980 Strix 4GB ($560)

Grand Theft Auto V

The highly anticipated iteration of the Grand Theft Auto franchise finally hit the shelves on April 14th 2015, with both AMD and NVIDIA in tow to help optimize the title. GTA doesn’t provide graphical presets, but opens up the options to users and extends the boundaries by pushing even the hardest systems to the limit using Rockstar’s Advanced Game Engine. Whether the user is flying high in the mountains with long draw distances or dealing with assorted trash in the city, when cranked up to maximum it creates stunning visuals but hard work for both the CPU and the GPU.

For our test we have scripted a version of the in-game benchmark, relying only on the final part which combines a flight scene along with an in-city drive-by followed by a tanker explosion. For low end systems we test at 720p on the lowest settings, whereas mid and high end graphics play at 1080p with very high settings across the board. We record both the average frame rate and the percentage of frames under 60 FPS (16.6ms).
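The "percentage of frames under 60 FPS" metric counts frames whose render time exceeds the 16.6ms budget of a 60Hz display. A minimal sketch of that calculation (the frame-time list here is hypothetical, not our actual capture data):

```python
def pct_under_60fps(frame_times_ms, threshold_ms=1000.0 / 60.0):
    """Percentage of frames rendered slower than 60 FPS, i.e. frames
    whose render time exceeds the ~16.6 ms budget of a 60 Hz display."""
    slow = sum(1 for t in frame_times_ms if t > threshold_ms)
    return 100.0 * slow / len(frame_times_ms)

# Example: four frames, two of which blow the 16.6 ms budget.
times = [12.0, 15.0, 20.0, 33.0]
print(pct_under_60fps(times))  # 2 of 4 frames -> 50.0
```

This is why the metric is a useful complement to the average: a run can average well above 60 FPS while still spending a noticeable fraction of its frames below it.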

Grand Theft Auto V on MSI R9 290X Gaming LE 4GB ($380)

Grand Theft Auto V on ASUS GTX 980 Strix 4GB ($560)

GRID: Autosport

No graphics tests are complete without some input from Codemasters and the EGO engine, which means for this round of testing we point towards GRID: Autosport, the next iteration in the GRID racing franchise. As with our previous racing tests, each update to the engine aims to add effects, reflections, detail and realism, with Codemasters making 'authenticity' a main focal point for this version.

GRID’s benchmark mode is very flexible, and as a result we created a test race using a shortened version of the Red Bull Ring with twelve cars doing two laps. The car in focus starts last and is quite fast, usually finishing second or third. For low end graphics we test at 1080p medium settings, whereas mid and high end graphics get the full 1080p maximum. Both the average and minimum frame rates are recorded.


GRID: Autosport on MSI R9 290X Gaming LE 4GB ($380)

GRID: Autosport on ASUS GTX 980 Strix 4GB ($560)

Middle-Earth: Shadow of Mordor

The final title in our testing is another battle of system performance with the open world action-adventure title, Shadow of Mordor. Produced by Monolith using the LithTech Jupiter EX engine with numerous detail add-ons, SoM goes for detail and complexity to a large extent, despite having been cut down from the original plans. The main story itself was written by the same writer as Red Dead Redemption, and it received Zero Punctuation's Game of the Year in 2014.

For testing purposes, SoM offers a dynamic screen resolution setting, allowing us to render at high resolutions that are then scaled down to the monitor. As a result, we run several tests using the in-game benchmark. For low end graphics we test at 720p with low settings, whereas mid and high end graphics get 1080p Ultra. The top graphics test is then repeated at 3840x2160, again with Ultra settings, and we test two cards at 4K where possible.

Shadow of Mordor (1080p) on MSI R9 290X Gaming LE 4GB ($380)

Shadow of Mordor (4K) on MSI R9 290X Gaming LE 4GB ($380)

Shadow of Mordor (1080p) on ASUS GTX 980 Strix 4GB ($560)

Shadow of Mordor (4K) on ASUS GTX 980 Strix 4GB ($560)

62 Comments

  • ViperV990 - Thursday, March 17, 2016 - link

    The i5-6400 @ $180 seems to be a much better part to OC.
  • nightbringer57 - Thursday, March 17, 2016 - link

    Heh, when some of the younger ones today speak about overclocking, I like to remind them of how much more financially interesting overclocking used to be. It's like everyone forgets how overclocking worked a few years ago. I still remember my cheap student gaming PC with a Pentium E2180 that went from 2GHz to 3GHz with a standard tower rad and only a slight voltage boost. Then you could have almost all of the performance of the 300€ CPUs (except a good bit of the cache) for 60€ or so. Multiplier overclocking is easier, yes, and it's good for reaching insane peak frequencies - but this market of "buy low, push high" overclocking has faded out (courtesy, of course, of segmentation by core count as well).
  • BrokenCrayons - Thursday, March 17, 2016 - link

    Oh yeah, well I overclocked when there were still turbo buttons on the fronts of AT cases! So nyah nyah!

    Sarcasm aside though, drawing a line in the sand to mark when overclocking was "good" or "worthwhile" and when it stopped being fun or having any sort of point would result in an awful lot of people drawing an awful lot of lines all over the place. For instance, the last processor I bothered with overclocking was a 2GHz Pentium 4 derived Celeron. Pushing the FSB from 100 to 150MHz on an Intel boxed cooler with a little bit of extra voltage netted a 3GHz chip...which rapidly became highly unstable over the course of a few months. After that and numerous PIIs, PIIIs, the infamous Celeron 300A and whatnot, I got bored with it and my priorities shifted. I would have overclocked my VIC-20 and Trash 80 if I'd known more about computers, because I couldn't resist tinkering. I think if one were to ask other people, they'd find different points in time and different processor technologies, so it's probably unfair to people who are, simply by nature of the date of their birth, unable to discuss overclocking in terms you're more comfortable with.
  • nightbringer57 - Thursday, March 17, 2016 - link

    Yes, but still. There had been a more or less constant trend of tinkering around with low-end CPUs to get quasi-high-end performance out of them for quite a long time. I cite my old E2180, but over the "modern" history of computers (that is, in the current context, the IBM PC and its heirs), there have always been such shenanigans available to tinkerers. If you go further back in time, the trend fades as the modern concept of a CPU "range" fades out, and it comes down more to boosting your X-generation CPU to still have a bit more oomph after most of the software environment of your given platform had moved to a new generation.
    And not only Intel processors, but AMD processors as well, with the pencil-unlockable Durons and whatnot.

    As this article states, this kind of overclocking has more or less died in recent years, partly due to technical issues (as systems get more and more complex and integrated, it becomes riskier), partly due to the current state of the market, partly due to marketing practices.

    It's not about discussing overclocking in terms I personally am comfortable with or whatnot. It's just about being realistic. I hope that AMD can come back with Zen and bring a bit more freshness into the low-end overclocking market.
  • Spoelie - Friday, March 18, 2016 - link

    Still had a lot of fun in the period between 2000-2010 with the Athlons, always buying the lowest end SKU of the performance line, and OCing between 20-40% to reach the same performance as the highest end SKU in the line.

    E.g.
    On an nForce2 board IIRC
    * Athlon XP 1800+ (Socket A Thoroughbred, 256KB cache) 1533MHz OC to ~2GHz
    * Athlon XP 2500+ (Socket A Barton, 512KB cache) FSB166 to FSB200 = OC to "3200+"

    Had an Athlon 64 2800+ on a Socket 754 for a very short time, don't remember what I did to it.

    Then a "DFI LanParty UT NF4 Ultra-D" (Socket 939 w/ nForce4 & 2*512MB Winbond BH-5 PC3200 @ 250mhz 2-2-2), cream of the crop at the time.
    * Athlon 64 3000+ (Venice) OC 1800 to 2250 (250bus)
    * Opteron 165 (Toledo) OC 1800 to 2475 (274bus)

    I loved those days
  • Murloc - Sunday, March 20, 2016 - link

    yeah, I remember a 45nm Core 2 Duo I had: with the boxed stock cooler I was able to lower the voltage quite a bit and daily-OC it at 4GHz at the same time.
    It was a lucky piece compared to others.
  • cobrax5 - Monday, March 21, 2016 - link

    I'm thinking about replacing my 45nm i7-930 @ 3.8GHz with a hex-core 32nm Xeon and OCing that to > 3.6GHz. You can get them for under $200, and I'll keep my (admittedly aging) X58 platform.
  • benedict - Thursday, March 17, 2016 - link

    Single-threaded benchmarks make this processor look much better than it'd be in real life. I don't know if there are people who only run a single program at a time on their PCs. Having more cores is much more valuable than most benchmarks will show.
  • TheinsanegamerN - Thursday, March 17, 2016 - link

    I can run 7 programs at once, but if one is very demanding and single-threaded, then single-threaded performance is still quite relevant. Running multiple programs does not mean single-threaded performance stops mattering. Thinking that single-threaded performance is not important got AMD the FX series, and subsequently a large portion of their users jumping to Intel.
  • calculagator - Thursday, March 17, 2016 - link

    Everyone is different, but in my experience single-threaded benchmarks give a much better picture of performance for "normal" users than multithreaded ones. Even if they have lots of programs running, most users are only using one program at a time. All of those open documents and web tabs use very little CPU power while they just sit there. I have about 100 active processes right now, but my CPU is idling at about 3% usage.
    Even a basic dual-core CPU can handle most users' multitasking. The most common exceptions are gaming and video editing, but most users are not doing those things most of the time. Consider how people use laptops: their CPUs have such good single-threaded/burst performance that users hardly notice they are much less powerful than desktop CPUs.
