Benchmarking Testbed Setup

For deep learning workloads, our standard testbed hardware has been modified with a larger SSD and more RAM.

CPU:           Intel Core i7-7820X @ 4.3GHz
Motherboard:   Gigabyte X299 AORUS Gaming 7
Power Supply:  Corsair AX860i
Hard Disk:     Intel 1.1TB
Memory:        G.Skill TridentZ RGB DDR4-3200, 4 x 16GB (15-15-15-35)
Case:          NZXT Phantom 630 Windowed Edition
Monitor:       LG 27UD68P-B
Video Cards:   NVIDIA Titan V
               NVIDIA Titan Xp
               NVIDIA GeForce GTX Titan X (Maxwell)
               AMD Radeon RX Vega 64
Video Drivers: NVIDIA: Release 390.30 for Linux x64
               AMD:
OS:            Ubuntu 16.04.4 LTS
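
Before benchmarking, it's worth a quick sanity check that the driver actually exposes each NVIDIA card. The sketch below is illustrative only and assumes a CUDA-enabled PyTorch build (not part of the testbed list above); the Radeon RX Vega 64 runs on a separate ROCm/OpenCL stack and is not covered here.

    # Hypothetical sanity check: enumerate the CUDA devices that the
    # NVIDIA driver (Release 390.30 here) exposes to the framework.
    import torch

    if torch.cuda.is_available():
        for i in range(torch.cuda.device_count()):
            props = torch.cuda.get_device_properties(i)
            print(f"GPU {i}: {props.name}, {props.total_memory / 1024**3:.1f} GiB")
    else:
        print("No CUDA devices visible -- check the driver installation.")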

As deep learning benchmarking required some extra hardware, we must give thanks to those who made this all happen.

Many Thanks To...

Many thanks to our patient colleagues over at Tom's Hardware for both splitting custody of the Titan V and lending us their Titan Xp and Quadro P6000. None of this would have been possible without their support.

And thank you to G.Skill for providing us with a 64GB set of DDR4 memory suitable for deep learning workloads, no small feat in these price-inflated DDR4 times. G.Skill has been a long-time supporter of AnandTech, supplying memory for testing beyond our CPU and motherboard reviews. We've reported on their high-capacity and high-frequency kits, and every year at Computex G.Skill holds a world overclocking tournament with liquid nitrogen right on the show floor.

Further Reading: AnandTech's Memory Scaling on Haswell Review, with G.Skill DDR3-3000

65 Comments

  • SirCanealot - Tuesday, July 3, 2018 - link

    No overclocking benchmarks. WAT. ¬_¬ (/s)

    Thanks for the awesome, interesting write up as usual!
  • Chaitanya - Tuesday, July 3, 2018 - link

    This is more of an enterprise product than a consumer one, so even if overclocking is enabled, it's something that the targeted demographic is not going to use.
  • Samus - Tuesday, July 3, 2018 - link

    wooooooosh
  • MrSpadge - Tuesday, July 3, 2018 - link

    He even put the "end sarcasm" tag (/s) to point out this was a joke.
  • Ticotoo - Tuesday, July 3, 2018 - link

    Where oh where are the macOS drivers? It took 6 months to get the Pascal Titan drivers.
    Hopefully soon.
  • cwolf78 - Tuesday, July 3, 2018 - link

    Nobody cares? I wouldn't be surprised if support gets dropped at some point. macOS isn't exactly going anywhere.
  • eek2121 - Tuesday, July 3, 2018 - link

    Quite a few developers and professionals use Macs, as do college students. By manufacturer market share, Apple probably has the biggest share; if not, it's definitely in the top 5.
  • mode_13h - Tuesday, July 3, 2018 - link

    I doubt it. Linux rules the cloud, and that's where all the real horsepower is. Lately, anyone serious about deep learning is using Nvidia on Linux. It's only 2nd-tier players, like AMD and Intel, who really stand to gain anything by supporting niche platforms like Macs and maybe even Windows/Azure.

    Once upon a time, Apple actually made a rackmount OS X server. I think that line has long since died off.
  • Freakie - Wednesday, July 4, 2018 - link

    Lol, those developers and professionals use their Macs to remote into their compute servers, not to do any of the number crunching.

    The idea of using a personal computer for anything except writing and debugging code is next to unheard of in an environment that requires the kind of power these GPUs are meant to output. The machine they use for the actual computations is, 99.5% of the time, a dedicated server used for nothing but heavy compute tasks, usually with no graphical interface, just a straight command line.
  • philehidiot - Wednesday, July 4, 2018 - link

    If it's just a command line, why bother with a GPU like this? Surely integrated graphics would do?

    (Even though this is a joke, I'm not sure I can bear the humiliation of pressing "submit")
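
As an aside to Freakie's point above about headless compute servers: that "straight command-line" workflow typically boils down to a script launched over SSH that writes only to stdout. Below is a minimal, purely illustrative sketch, assuming PyTorch; the model, arguments, and objective are placeholders, not anything from the article or comments.

    # Illustrative headless training stub, meant to be launched over SSH
    # (e.g. under nohup or tmux) on a GPU server with no display attached.
    import argparse

    import torch

    def main():
        parser = argparse.ArgumentParser(description="Headless training stub")
        parser.add_argument("--epochs", type=int, default=10)
        parser.add_argument("--device", default="cuda")
        args = parser.parse_args()

        # Fall back to CPU so the stub still runs on machines without CUDA.
        device = torch.device(args.device if torch.cuda.is_available() else "cpu")
        model = torch.nn.Linear(128, 10).to(device)    # placeholder model
        opt = torch.optim.SGD(model.parameters(), lr=0.01)

        for epoch in range(args.epochs):
            x = torch.randn(64, 128, device=device)    # dummy batch
            loss = model(x).pow(2).mean()               # dummy objective
            opt.zero_grad()
            loss.backward()
            opt.step()
            print(f"epoch {epoch}: loss {loss.item():.4f}")  # stdout only, no GUI

    if __name__ == "__main__":
        main()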
