Gaming Benchmarks

A side note for our benchmarks: we wanted to test four-way GPU comparisons, but unfortunately I am waiting on a replacement for one of my GPUs.  Apparently inserting and reinserting them 2000+ times over two years is not a normal usage scenario…!

Metro 2033

Our first analysis is with the perennial reviewers’ favorite, Metro 2033.  It appears in a lot of reviews for a couple of reasons: it has a very easy-to-use benchmark GUI, and it is often very GPU limited, at least in single-GPU mode.  Metro 2033 is a strenuous DX11 benchmark that can challenge most systems that try to run it at any high-end settings.  The game was developed by 4A Games and released in March 2010; we use the inbuilt DirectX 11 Frontline benchmark to test the hardware at 1440p with full graphical settings.  Results are given as the average frame rate from a second batch of four runs, as Metro has a tendency to inflate the scores of the first batch by up to 5%.
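
To make that procedure concrete, here is a minimal sketch of a run-and-average harness in Python.  The executable path, command-line flags, and log format are hypothetical stand-ins rather than Metro's actual interface:

    import statistics
    import subprocess

    METRO_EXE = r"C:\Games\Metro2033\benchmark.exe"  # hypothetical path

    def run_batch(num_runs=4):
        """Run the benchmark num_runs times, returning each run's average FPS."""
        results = []
        for _ in range(num_runs):
            out = subprocess.run([METRO_EXE, "-scene", "frontline"],
                                 capture_output=True, text=True).stdout
            # Hypothetical log line: "Average FPS: 57.3"
            results.append(float(out.split("Average FPS:")[1].split()[0]))
        return results

    first_batch = run_batch()   # discarded: first-batch scores can inflate by up to ~5%
    second_batch = run_batch()  # this is the batch we report
    print(f"Reported result: {statistics.mean(second_batch):.1f} FPS")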

[Chart: Metro 2033 - One 7970, 1440p, Max Settings]

[Chart: Metro 2033 average frame rates for AMD and NVIDIA at one, two, and three GPUs]

Dirt 3

Dirt 3 is a rally racing game, the third in the Dirt branch of the Colin McRae Rally series, developed and published by Codemasters.  Dirt 3 also falls under the list of ‘games with a handy benchmark mode’.  In previous testing, Dirt 3 has always seemed to love everything: cores, memory, GPUs, PCIe lane bandwidth.  The small issue with Dirt 3 is that, depending on the mode tested, the benchmark launcher is not indicative of gameplay per se, citing numbers higher than those actually observed.  The benchmark mode also includes an element of uncertainty by actually driving a race, rather than replaying a predetermined sequence of events as Metro 2033 does.  This in essence makes the benchmark more variable, so we take repeated runs in order to smooth this out, as sketched below.  Using the benchmark mode, Dirt 3 is run at 1440p with Ultra graphical settings.  Results are reported as the average frame rate across four runs.
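
Because the benchmark drives an actual race, per-run numbers wander; the smoothing is nothing more exotic than averaging the runs and keeping an eye on the spread.  A minimal sketch with made-up numbers:

    import statistics

    # Hypothetical per-run averages from four Dirt 3 benchmark runs (FPS)
    runs = [82.4, 79.8, 81.1, 80.6]

    print(f"Average: {statistics.mean(runs):.1f} FPS "
          f"(run-to-run spread: +/-{statistics.stdev(runs):.1f})")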

[Chart: Dirt 3 - One 7970, 1440p, Max Settings]

[Chart: Dirt 3 average frame rates for AMD and NVIDIA at one, two, and three GPUs]

Civilization V

A game that has plagued my testing over the past twelve months is Civilization V.  Being on the older 12.3 Catalyst drivers was somewhat of a nightmare, as they gave no multi-GPU scaling, and as a result I dropped the game from my test suite after only a couple of reviews.  With the later drivers used for this review the situation has improved, but only slightly, as you will see below.  Civilization V seems to run into a scaling bottleneck very early on, and any additional GPU allocation only makes performance worse.

Our Civilization V testing uses Ryan’s GPU benchmark test, all wrapped up in a neat batch file.  We test at 1440p, and report the average frame rate of a 5 minute test.
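
As an illustration of the post-processing involved (not the actual contents of Ryan’s batch file), computing an average frame rate from a captured frame-time log could look like this; the log name and format are assumptions:

    # Hypothetical FRAPS-style log: one frame time in milliseconds per line,
    # captured over the 5 minute test window.
    with open("civ5_frametimes.log") as f:
        frame_times_ms = [float(line) for line in f if line.strip()]

    # Average frame rate = total frames / total seconds.  Averaging per-frame
    # instantaneous FPS instead would over-weight the fastest frames.
    total_seconds = sum(frame_times_ms) / 1000.0
    print(f"Average: {len(frame_times_ms) / total_seconds:.1f} FPS "
          f"over {total_seconds / 60:.1f} minutes")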

[Chart: Civilization V - One 7970, 1440p, Max Settings]

[Chart: Civilization V average frame rates for AMD and NVIDIA at one, two, and three GPUs]

Sleeping Dogs

While not necessarily a game on everybody’s lips, Sleeping Dogs is a strenuous game with a pretty hardcore benchmark that scales well with additional GPU power due to its SSAA implementation.  The team over at Adrenaline.com.br is supreme for making an easy-to-use benchmark GUI, allowing a numpty like me to charge ahead with a set of four 1440p runs at maximum graphical settings.
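
For a rough sense of why an SSAA-heavy benchmark keeps scaling with extra GPU power, consider the pixel counts involved (the 2x2 factor here is an assumption for illustration, not Sleeping Dogs’ confirmed setting):

    # 2x2 supersampling shades 4x the pixels of the target resolution,
    # then downsamples; GPU load rises almost linearly with shaded pixels.
    base_pixels = 2560 * 1440
    ssaa_factor = 4  # hypothetical 2x2 SSAA
    print(f"Pixels shaded per frame: {base_pixels:,} -> {base_pixels * ssaa_factor:,}")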

[Chart: Sleeping Dogs - One 7970, 1440p, Max Settings]

[Chart: Sleeping Dogs average frame rates for AMD and NVIDIA at one, two, and three GPUs]

GPU Conclusions

It is clear that the x8/x4/x4 PCIe lane allocation from the OC Formula is preferable to an x8/x8 + x4 arrangement when using three AMD GPUs, although I am waiting to see how a PLX-enabled motherboard performs in this scenario as well.
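
Some back-of-envelope bandwidth numbers show why the lane split matters.  This sketch assumes the '+ x4' in an x8/x8 + x4 layout hangs off the chipset at PCIe 2.0 rates, while x8/x4/x4 keeps all three cards on the CPU’s PCIe 3.0 lanes; the per-lane figures are the usual approximate usable rates:

    GB_PER_LANE = {"3.0": 0.985, "2.0": 0.5}  # approx. usable GB/s per lane

    def slot_bw(lanes, gen):
        """Approximate usable bandwidth (GB/s) for a slot."""
        return lanes * GB_PER_LANE[gen]

    print("x8/x4/x4, all CPU PCIe 3.0:     ",
          [slot_bw(8, "3.0"), slot_bw(4, "3.0"), slot_bw(4, "3.0")])
    print("x8/x8 + x4, chipset PCIe 2.0 x4:",
          [slot_bw(8, "3.0"), slot_bw(8, "3.0"), slot_bw(4, "2.0")])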

Comments

  • IanCutress - Saturday, July 27, 2013 - link

    Out of my three 4770K CPUs I have had in, one fails to do 4.2 GHz at 1.4 volts, and the other two will do 4.6 GHz fairly easily and stably, but require a big voltage push for 4.7 GHz. Of these two chips, however, one was 6°C cooler under 4.6 GHz OCCT load. Guess what - I'd kill that 4.2 GHz CPU before it hit 4.6 GHz. It's part of what is called 'the silicon lottery'.

    Please post your experiences of auto overclocking vs. manual on your CPUs, so we can provide a reference. What seems to have gone over some heads is that manufacturers are people too, and thus can design automatic overclock settings that are aggressive. ASRock are clearly being aggressive enough with the automatic settings for my chips (luckily), and the wide variation in Haswell samples (4.2 GHz to 5.0 GHz air stable) makes it hard to compare different motherboards in terms of 24/7 overclocking - as these OC boards are built for sub-zero, anything on air or water is essentially a stroll in the park.

    If you have any suggestions rather than blanket statements, I would be amenable to listening to them.

    Ian
  • ikjadoon - Sunday, December 1, 2013 - link

    Hi, Ian! Sorry to bump old thread, but someone just told me to never trust Anandtech regarding OCs, so I was naturally curious about this mixup.

    Regarding your 4.2GHz @ 1.40V chip, what was your VRIN? Intel themselves recommended keeping it 0.4V above Vcore, so at least 1.80V in this case (as reported by one of Gigabyte's BIOS QA testers). Source: http://www.overclock.net/t/1401976/the-gigabyte-z8...

    The same question applies to this review: http://www.anandtech.com/show/7175/asrock-z87m-oc-...

    The VRIN was at 1.650V, only 0.3V above the 1.350V Vcore and short of the recommended 0.4V margin. Were you able to try a higher VRIN in the review of the mATX board?
  • ikjadoon - Sunday, December 1, 2013 - link

    EDIT: sorry, got them confused. That question applies to THIS review, where your VRIN is too low. I have no idea what it was on that mATX board, but am definitely curious.
  • Aikouka - Friday, July 26, 2013 - link

    I thought that I would add that even my cheap Z87 Extreme4 has an on-board USB port, so it's not really something fancy that was added to this board. =P

    Also, I'm rather disappointed that, if they were going to make this available for water cooling, they went with built-in barbs. That's just lazy. ASUS seems to have it right with the Maximus VI Formula, as it uses G1/4 threads. Unfortunately, it's not out yet, but it's supposed to release in a week or two (early August).
  • IanCutress - Saturday, July 27, 2013 - link

    ASRock seem to have latched onto it as a useful feature. It is certainly a plus, I wonder what the uptake % is. My father just informed me that his version of Cubase still uses a USB verification dongle.
  • This Guy - Saturday, July 27, 2013 - link

    I haven't read a review anywhere with the HDMI-In actually working with discrete GPUs. For me it just makes my screen flash black.

    The Molex GPU power connector is positioned poorly. It points flat against where most power supplies go. Granted, when used as an open-air overclocking board this won't be a problem, but ASRock use this connector on many of their high-end Z87 boards.
  • ThortonBe - Saturday, July 27, 2013 - link

    Under the feature overview sections, I believe it should read "Purity Sound" as opposed to "Purity Audio".

    The LCD screen is a neat addition. I wonder how much it raised the B.O.M.
  • Gigaplex - Monday, July 29, 2013 - link

    "I rather like the ASRock BIOS"

    I hate it. My Linux server runs an ASRock board (A75M-HVS) booting via UEFI, but they released a newer firmware that breaks Linux UEFI booting. I emailed their tech support, and instead of a reply stating that Linux is unsupported, I simply never heard back. I had to roll back to the older firmware.
  • Montago - Monday, July 29, 2013 - link

    Why didn't Asus use THIS colorscheme ???... WHYYYYY

    Black & Yellow is awesome...
  • QChronoD - Monday, July 29, 2013 - link

    Curious about the HDMI-in and how useful it really is. Did you really not have a single other device in your house with HDMI output? Does it work with dual monitors? Can it change just one of the screens without screwing up your desktop?
    Is anyone else offering HDMI-in, also for less than $200?
