Although the MW31-SP0 is not necessarily a gaming-focused system, the use of a PLX switch suggests that GPU intercommunication is a big part of this motherboard's appeal. That could mean gaming, compute tasks using PCIe co-processors, or pairing the system with a few network cards to act as a switch/router or firewall when required. As a way of testing the PCIe slots, we put the board through our regular GPU tests.

Gaming Performance 2015

Our 2015 gaming results are still relatively new, but FCLK settings might play a big role here. At launch, the default frequency for the communication buffer between the CPU and the PCIe stack was 800 MHz due to firmware limitations, even though Intel recommended 1000 MHz. Since then, Intel has released firmware to enable 1000 MHz, and most motherboard manufacturers have implemented it, but it is unclear whether a given motherboard will default to 1000 MHz, and this might vary from one BIOS version to the next. As we test at default settings, our numbers are only ever snapshots in time, but this leads to some interesting differences in discrete GPU performance.

Alien: Isolation

If first-person survival mixed with horror is your sort of thing, then Alien: Isolation, based on the Alien franchise, should be an interesting title. Developed by The Creative Assembly and released in October 2014, Alien: Isolation has won numerous awards, from Game of the Year to several top 10s/25s and Best Horror titles, racking up over a million sales by February 2015. Alien: Isolation uses a custom-built engine which includes dynamic sound effects and should be fully multi-core enabled.

For low-end graphics, we test at 720p with Ultra settings, whereas for mid and high range graphics we bump this up to 1080p, taking the average frame rate as our marker with a scripted version of the built-in benchmark.

Alien: Isolation on NVIDIA GTX 770 2GB ($245)

Alien: Isolation on NVIDIA GTX 980 4GB ($560)

Total War: Attila

The Total War franchise moves on to Attila, another title from The Creative Assembly. It is a stand-alone strategy game set in 395 AD, where the main storyline lets the gamer take control of the leader of the Huns in order to conquer parts of the world. Graphically the game can render hundreds or thousands of units on screen at once, all with their individual actions, which can put some of the big cards to task.

For low-end graphics, we test at 720p with performance settings, recording the average frame rate. With mid and high range graphics, we test at 1080p with the quality setting. In both circumstances, unlimited video memory is enabled and the in-game scripted benchmark is used.

Total War: Attila on AMD R7 240 DDR3 2GB ($70)

Total War: Attila on NVIDIA GTX 770 2GB ($245)

Total War: Attila on NVIDIA GTX 980 4GB ($560)

Grand Theft Auto V

The highly anticipated iteration of the Grand Theft Auto franchise finally hit the shelves on April 14th 2015, with both AMD and NVIDIA in tow to help optimize the title. GTA V doesn't provide graphical presets, but opens up the options to users and extends the boundaries by pushing even the hardest systems to the limit using Rockstar's Advanced Game Engine. Whether the user is flying high in the mountains with long draw distances or dealing with assorted trash in the city, when cranked up to maximum the game creates stunning visuals but hard work for both the CPU and the GPU.

For our test we have scripted a version of the in-game benchmark, relying only on the final part, which combines a flight scene with an in-city drive-by followed by a tanker explosion. For low-end systems we test at 720p on the lowest settings, whereas mid and high-end graphics play at 1080p with very high settings across the board. We record both the average frame rate and the percentage of frames under 60 FPS (i.e. frame times over 16.6 ms).
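As a rough illustration of how those two metrics fall out of a per-frame log, here is a minimal Python sketch. The frame times below are made-up sample data, not output from our benchmark pipeline, and the exact capture format is an assumption.

```python
# Sketch: deriving an average frame rate and the "% of frames under
# 60 FPS" figure from per-frame render times in milliseconds.
frame_times_ms = [12.1, 15.8, 16.9, 14.2, 22.5, 13.3, 16.7, 15.0]  # made-up data

# Average frame rate: total frames over total time, not the mean of
# per-frame FPS values (which would overweight the fastest frames).
avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# A frame is "under 60 FPS" when it took longer than 1000/60 ≈ 16.6 ms.
threshold_ms = 1000.0 / 60.0
pct_under_60 = 100.0 * sum(t > threshold_ms for t in frame_times_ms) / len(frame_times_ms)

print(f"Average: {avg_fps:.1f} FPS, {pct_under_60:.1f}% of frames under 60 FPS")
```

The distinction matters because an average can look healthy while a meaningful fraction of frames still miss the 16.6 ms budget.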

Grand Theft Auto V on AMD R7 240 DDR3 2GB ($70)

Grand Theft Auto V on NVIDIA GTX 770 2GB ($245)

Grand Theft Auto V on NVIDIA GTX 980 4GB ($560)

GRID: Autosport

No set of graphics tests is complete without some input from Codemasters and the EGO engine, which means for this round of testing we point towards GRID: Autosport, the next iteration in the GRID racing series. As with our previous racing tests, each update to the engine aims to add effects, reflections, detail and realism, with Codemasters making 'authenticity' a main focal point for this version.

GRID’s benchmark mode is very flexible, and as a result we created a test race using a shortened version of the Red Bull Ring with twelve cars doing two laps. The car in focus starts last and is quite fast, but usually finishes second or third. For low-end graphics we test at 1080p with medium settings, whereas mid and high-end graphics get the full 1080p maximum. Both the average and minimum frame rates are recorded.
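Since both an average and a minimum are quoted here, a short sketch of the distinction may help; the numbers are illustrative only, not GRID benchmark output.

```python
# Sketch: average vs. minimum frame rate from per-frame times in ms.
frame_times_ms = [10.5, 11.2, 10.8, 25.0, 11.0, 10.9]  # made-up data with one stutter

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# The minimum frame rate corresponds to the single slowest frame, which
# is why it is far more sensitive to one-off stutter than the average.
min_fps = 1000.0 / max(frame_times_ms)

print(f"avg {avg_fps:.1f} FPS, min {min_fps:.1f} FPS")
```

One 25 ms hitch barely moves the average but drags the minimum down to 40 FPS, which is exactly why we report both.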

Average Frame Rate / Minimum Frame Rates

GRID: Autosport on AMD R7 240 DDR3 2GB ($70)

GRID: Autosport on NVIDIA GTX 770 2GB ($245)

GRID: Autosport on NVIDIA GTX 980 4GB ($560)

Middle-Earth: Shadow of Mordor

The final title in our testing is another test of system performance: the open-world action-adventure title Shadow of Mordor. Produced by Monolith using the LithTech Jupiter EX engine and numerous detail add-ons, SoM goes for detail and complexity to a large extent, despite having been cut down from the original plans. The main story was written by the same writer as Red Dead Redemption, and the game received Zero Punctuation's Game of the Year award for 2014.

For testing purposes, SoM offers a dynamic screen resolution setting, allowing us to render at high resolutions that are then scaled down to the monitor. As a result, we get several tests using the in-game benchmark. For low-end graphics we test at 720p with low settings, whereas mid and high-end graphics get 1080p Ultra. The top graphics card is also retested at 3840x2160, again with Ultra settings.
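To make the dynamic resolution idea concrete, here is a minimal sketch of what such a setting does in principle: the game renders internally at a scaled resolution and downsamples to the display. The 200% figure is a hypothetical example, not SoM's actual setting name or our test value.

```python
# Sketch: internal render resolution under a dynamic/super-sampling scale.
display = (1920, 1080)       # the physical monitor resolution
render_scale = 2.0           # hypothetical 200% setting per axis

# The GPU shades every pixel of the internal resolution, then the
# result is scaled down to the display.
internal = (int(display[0] * render_scale), int(display[1] * render_scale))
pixel_ratio = (internal[0] * internal[1]) / (display[0] * display[1])

print(internal, pixel_ratio)
```

At a 2x per-axis scale the internal target is 3840x2160, i.e. four times the pixels to shade, which is why the 4K-equivalent run is reserved for the top card.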

Shadow of Mordor on AMD R7 240 DDR3 2GB ($70)

Shadow of Mordor on NVIDIA GTX 770 2GB ($245)

Shadow of Mordor on NVIDIA GTX 980 4GB ($560)

28 Comments


  • Samus - Friday, April 22, 2016 - link

    Ahh DFI, they sealed their fate in the consumer market with the Lanparty line, probably the most unreliable mainstream motherboards ever. It's hard to believe the same company made the legendary Infinity motherboards. Hopefully Supermicro doesn't do the same as they begin to enter the mainstream desktop board market...
  • jabber - Friday, April 22, 2016 - link

    Hmmm, not how I remember it at the time. Basically if you didn't have a 939 DFI SLI board in your system you just weren't serious. My 939 Lanparty board gave superb service for several years with an Opteron 180, 2GB DDR500, 2 x 7900GTX and a Raptor. Sure they were quirky to configure to get the best out of them, but I don't remember them being unreliable, even as a Lanparty Forum member. Still, it was 10 years ago.
  • Samus - Friday, April 22, 2016 - link

    Depends which board you had. If you remember, the short-lived nForce chipset through its 430 and 460 incarnations gave DFI a bad rap for reliability. The boards worked fine until they didn't. Most of the reviews were great initially until 2-3 years went by and the boards all started failing. It wasn't a DFI issue, but DFI no doubt made more nForce boards than anybody for overclocking, and the long-term reliability of those chipsets was terrible.

    The nForce4 goes down in history as one of the highest performing yet most unstable platforms ever. God forbid you actually loaded the nVidia disk controller drivers instead of using the Microsoft default. Except perhaps the Intel 815e and VIA southbridges, those were pretty terrible too.

    Some chipsets get a bad rap for no reason though, even recalled ones like the Intel P67. I still have the early non-Ivy Bridge compatible version on an Intel board no less and it's been stable in my HTPC for 5 years running daily...it must have 50,000 hours on it by now.
  • DanNeely - Friday, April 22, 2016 - link

    Everyone's NF4 boards died like crazy though. That doesn't explain why some makers like DFI were destroyed by nVidia's fiasco, while the ones who're still around managed to avoid any reputational damage from it.
  • jabber - Saturday, April 23, 2016 - link

    I never had an issue with the nVidia SATA drivers. The only thing that screwed the build was the nVidia Network monitoring/security drivers. I used to help buddies with poor NF4 setups by asking first if they installed that. Yes they had. You only ever tried installing that crap once.
  • danjw - Thursday, April 21, 2016 - link

    In the chipset chart you have: "Supports Intel Xeon E5-1200 v5 CPUs" that should be "E3-1200 v5".
  • sivaplus - Friday, April 22, 2016 - link

    Indeed. Just because I saw E5-12... I knew there is something fishy here :). Still E5-14xx support would have been nice at this price point
  • Lord 666 - Thursday, April 21, 2016 - link

    So frustrating to see the ram maxed out at 64gb. The x99 platform maxes out at 128gb. Hoping a x99 successor arrives supporting skylake CPUs
  • kgardas - Thursday, April 21, 2016 - link

    Indeed, but this is Intel marketing. For more RAM on Xeon you need to go to D/E5 territory...
  • SFNR1 - Friday, April 22, 2016 - link

    Before the v5, there was a limit of 32GB, so 64GB is a huge improvement here (finally), but yes, it's a pain. You would have to buy a Xeon E5 1x http://ark.intel.com/compare/82767,82766,82765,827... . That price bump is insane.
