Alien: Isolation

If first-person survival mixed with horror is your sort of thing, then Alien: Isolation, based on the Alien franchise, should be an interesting title. Developed by The Creative Assembly and released in October 2014, Alien: Isolation has won numerous awards, from Game of the Year to several top 10/25 lists and Best Horror titles, racking up over a million sales by February 2015. Alien: Isolation uses a custom-built engine which includes dynamic sound effects and should be fully multi-core enabled.

Alien Isolation on ASUS GTX 980 Strix 4GB ($560)

Alien Isolation on MSI R9 290X Gaming LE 4GB ($380)

Total War: Attila

The Total War franchise moves on to Attila, another title developed by The Creative Assembly. It is a stand-alone strategy title set in 395 AD, with a main story line that lets the gamer take control of the leader of the Huns in order to conquer parts of the world. Graphically the game can render hundreds or thousands of units on screen at once, each with its own individual actions, and can put some of the big cards to task.

For low-end graphics, we test at 720p with performance settings, recording the average frame rate. With mid- and high-range graphics, we test at 1080p with quality settings. In both circumstances, unlimited video memory is enabled and the in-game scripted benchmark is used.

Total War: Attila on ASUS GTX 980 Strix 4GB ($560)

Total War: Attila on MSI R9 290X Gaming LE 4GB ($380)

Grand Theft Auto V

The highly anticipated iteration of the Grand Theft Auto franchise finally hit the shelves on April 14th 2015, with both AMD and NVIDIA in tow to help optimize the title. GTA doesn’t provide graphical presets, but opens up the options to users and extends the boundaries by pushing even the hardest systems to the limit using Rockstar’s Advanced Game Engine. Whether the user is flying high in the mountains with long draw distances or dealing with assorted trash in the city, when cranked up to maximum it creates stunning visuals but hard work for both the CPU and the GPU.

For our test we have scripted a version of the in-game benchmark, relying only on the final part, which combines a flight scene with an in-city drive-by followed by a tanker explosion. For low-end systems we test at 720p on the lowest settings, whereas mid- and high-end graphics play at 1080p with very high settings across the board. We record both the average frame rate and the percentage of frames under 60 FPS (i.e. frames taking longer than 16.6 ms).
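To illustrate the two metrics recorded above, here is a minimal sketch of how an average frame rate and a "percentage of frames under 60 FPS" figure can be derived from a frame-time log. The function names and the sample frame times are my own for illustration; this is not the actual test harness.

```python
# Derive benchmark-style metrics from a list of frame times in milliseconds.
# A frame slower than the 60 FPS budget (1000/60 ≈ 16.7 ms) counts as a
# "frame under 60 FPS". Sample data below is invented for demonstration.

def average_fps(frame_times_ms):
    """Average frame rate: total frames divided by total elapsed time."""
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

def percent_under_60fps(frame_times_ms, budget_ms=1000.0 / 60.0):
    """Share of frames that exceeded the 60 FPS frame-time budget."""
    slow = sum(1 for ft in frame_times_ms if ft > budget_ms)
    return 100.0 * slow / len(frame_times_ms)

frame_times = [12.0, 15.5, 16.0, 18.2, 33.1, 14.0, 16.5, 20.0]
print(f"Average FPS: {average_fps(frame_times):.1f}")
print(f"Frames under 60 FPS: {percent_under_60fps(frame_times):.1f}%")
```

Note that the two metrics can disagree: a run with a high average frame rate can still show a large share of slow frames if the frame times are spiky, which is why both numbers are reported.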

Grand Theft Auto V on ASUS GTX 980 Strix 4GB ($560)

Grand Theft Auto V on MSI R9 290X Gaming LE 4GB ($380)

Comments

  • JimmiG - Tuesday, May 31, 2016 - link

    What's worse than the price premium is that you're also paying for the previous generation architecture.

    I really don't see why anyone would want one of those CPUs. For gaming and most typical applications, the mainstream models are actually faster because of their more modern architecture and higher clock speeds. If you're a professional user, you should really be looking at Xeons rather than these server rejects.
  • K_Space - Tuesday, May 31, 2016 - link

    Exactly. I think that's the whole point: Intel realizes that, realistically, little profit will be made from these B-Es given the small incremental increase in performance, so why not use them as an advert for the Xeons (which they have been aggressively marketing for HEDT, not just servers, over the last few months). Anyone considering these will consider the Xeons now.
  • Ratman6161 - Tuesday, May 31, 2016 - link

    There are a few benchmarks where they do make sense, if and only if you are doing that particular task for your job, i.e. an environment where time is money. For the rest of us, if I need to do a video conversion of some kind, it's relatively rare and I can always start it before I go to bed.
  • retrospooty - Tuesday, May 31, 2016 - link

    People belittle AMD because even though Intel has dramatically slowed down the pursuit of speed, AMD still can't catch up. It's actually worse than that, though. If AMD had been competitive at all in the past decade, Intel would still be pursuing speed and would be further ahead. It's a double-edged sword sort of thing.
  • Flunk - Tuesday, May 31, 2016 - link

    Yes, Intel has slowed down for AMD to catch up before. Cough, Pentium 4.
  • retrospooty - Tuesday, May 31, 2016 - link

    Yup... and back then AMD took advantage of it. I was the happy owner of a Thunderbird, then an Athlon, then an Athlon X2... Then Intel woke up and AMD went to sleep. For the past decade AMD has been too far behind to even matter. In the desktop CPU space there is Intel and then... no one.
  • Flunk - Tuesday, May 31, 2016 - link

    You're right, it's totally Intel's fault. They could launch a line of high-end consumer chips that cost the same as the current i5/i7 line but had 2-3X as many cores but no iGPU. They'd cost Intel the same to fabricate. They're the only ones to blame for their slowing sales.
  • khon - Tuesday, May 31, 2016 - link

    I could see people buying the i7-6850K for gaming: 6 cores at decent speeds plus 40 PCIe lanes, and $600 is not that bad when you consider that some people have $700 1080s in SLI.

    However, the i7-6900/6950 look like they are for professional users only.
  • RussianSensation - Tuesday, May 31, 2016 - link

    40 PCIe lanes are worthless when the i7-6700K can reliably overclock to 4.7-4.8 GHz and has extra PCIe 3.0 lanes off the chipset. The 6850K will be lucky to get 4.5 GHz, and will still lose in 99% of gaming scenarios. Z170 PCIe lanes are sufficient for 1080 SLI and PCIe 3.0 x4 in RAID.

    The 6850K is the worst processor in the entire Broadwell-E line.
  • Impulses - Tuesday, May 31, 2016 - link

    Well, if you're only about gaming, you might as well compare it with the 6600K... AFAIK HT doesn't do much for gaming, does it? The 6800K isn't much better either when you can just save a few bucks with the 5820K.

    I feel like they could've earned some goodwill despite the high-end price hikes by just putting out a single 68xx SKU for like $500. It'd still be a relative price hike for entry into HEDT, but it could be more easily seen as a good value.

    Are the 6800K chips bad die harvests or something? It seems dumb to keep that artificial segmentation in place otherwise, when HEDT is already pretty far removed from the mainstream platform.

    When I chose the 6700K over the 5820K, I thought it'd be the last quad core I'd buy, but at this pace (price hikes, HEDT lagging further behind, lower-end SKUs still lane-limited) I don't know if that'll be true.
