At the launch of Intel's LGA-2011 based Sandy Bridge E CPU we finally had a platform capable of supporting PCI Express 3.0, but we lacked GPUs to test it with. That all changed this past week as we worked on our review of the Radeon HD 7970, the world's first 28nm GPU with support for PCIe 3.0.

The move to PCIe 3.0 increases per-lane bandwidth from 500MB/s to 1GB/s. For a x16 slot that means doubling bandwidth from 8GB/s under PCIe 2.1 to 16GB/s with PCIe 3.0. As we've seen in earlier reviews and our own internal tests, there's hardly any difference between PCIe 2.1 x8 and x16 for modern-day GPUs. The extra bandwidth of PCIe 3.0 wasn't expected to make any tangible difference in gaming performance, and in our 7970 tests it didn't.
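The arithmetic behind those figures is easy to sanity-check. Below is a minimal, purely illustrative sketch using the commonly quoted effective per-lane rates for each generation; the numbers are theoretical per-direction maximums, not measured throughput.

```python
# Back-of-envelope PCIe bandwidth arithmetic using commonly quoted
# effective per-lane rates (GB/s, per direction) for each generation.
PER_LANE_GBPS = {
    "PCIe 2.1": 0.5,   # 5 GT/s with 8b/10b encoding, roughly 500MB/s per lane
    "PCIe 3.0": 1.0,   # 8 GT/s with 128b/130b encoding, roughly 1GB/s per lane
}

def slot_bandwidth(gen: str, lanes: int) -> float:
    """Theoretical one-way bandwidth of a slot in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

for gen in PER_LANE_GBPS:
    for lanes in (8, 16):
        print(f"{gen} x{lanes}: {slot_bandwidth(gen, lanes):.0f} GB/s")

# PCIe 2.1 x16 -> 8 GB/s, PCIe 3.0 x16 -> 16 GB/s. Note that a 3.0 x8 slot
# matches a 2.1 x16 slot, which is why x8/x8 multi-GPU setups stop being
# a bandwidth concern on a 3.0 platform.
```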

Why implement PCIe 3.0 at all then? For GPU compute. Improving bandwidth and latency between the CPU and the GPU are both key to building a high performance heterogeneous computing solution. While good GPU compute benchmarks on the desktop are still hard to come by, we did find one that showed a real improvement from PCIe 3.0 support on the 7970: AMD's AES Encrypt/Decrypt sample application.

Simply enabling PCIe 3.0 on our EVGA X79 SLI motherboard (EVGA provided us with a BIOS that allowed us to toggle PCIe 3.0 mode on/off) resulted in a 9% increase in performance on the Radeon HD 7970. This tells us two things: 1) You can indeed get PCIe 3.0 working on SNB-E/X79, at least with a Radeon HD 7970, and 2) PCIe 3.0 will likely be useful for GPU compute applications, although not so much for gaming anytime soon.
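For what it's worth, a gain of that size is consistent with a simple transfer-plus-kernel time model in which doubling link bandwidth halves the time spent moving data between host and GPU. The transfer share used below is an assumption for illustration only, not something we measured on the AES sample.

```python
# Illustrative model: assume host<->device transfers account for roughly
# one sixth of total runtime over PCIe 2.1 and kernel time is unchanged.
transfer_frac = 0.165              # assumed PCIe 2.1 transfer share of runtime
compute_frac = 1.0 - transfer_frac

pcie21_time = compute_frac + transfer_frac
pcie30_time = compute_frac + transfer_frac / 2   # 2x the link bandwidth

speedup = pcie21_time / pcie30_time - 1.0
print(f"Overall speedup: {speedup:.1%}")   # ~9.0% with these assumed numbers
```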

Comments

  • ypsylon - Friday, December 23, 2011 - link

    To be honest, there is no point in rushing for a motherboard with Gen3 slots. It isn't worth it. Of course manufacturers will use even bigger letters in marketing slogans to convince people that they need Gen3 right now. But the simple truth is that only people running storage off PCI-E add-on RAID controllers will notice a boost in performance, because those cards can actually use the additional bandwidth, even on older Gen2 tech. The boost is not massive, but certainly noticeable. For gaming right now (and the foreseeable future) Gen3 PCI-E is as useful as the 'snooze' button on a smoke alarm. Even the most modern and powerful GPUs barely saturate a Gen2 x8 slot with data (whether in single-card mode or SLI/QF-ed to the max). And there is Gen4 on the horizon... I wonder what for right now, with hardware lagging so far behind the standards...
  • B3an - Saturday, December 24, 2011 - link

    You just said PCIe 3.0 isn't worth it but then go on to mention conditions where it WOULD be worth it. So clearly it is worth it for some, including people who do GPU compute. Secondly, if someone buys a 3.0 board right now and doesn't make use of the extra bandwidth, they could well use it in the future with a GPU upgrade, and not have to buy a new board because they're stuck with PCIe 2.0, so that's a very limited imagination you have. Don't know why anyone would moan about something being more future-proof.
  • MrSpadge - Wednesday, December 28, 2011 - link

    Interfaces have seldom been limiting and have always been used as a marketing argument. Falling for this was dumb in the '90s (and probably earlier) and it still is.

    However, getting the upgrade for free is nothing to complain about. Complaining about it is rather short-sighted, actually. Storage, GPU compute, Thunderbolt and multi-GPU gaming all benefit from the upgrade. And considering modern game engines like BF3, where data is continuously streamed to the GPU (since storing everything there would simply be too much), there's the possibility the interface will matter even more for single cards in the future.

    MrS
  • Arnulf - Friday, December 23, 2011 - link

    What about the situation with fewer-than-maximum (16 per card) PCI-e lane boards? Most people use 16-lane boards and some of them opt for SLI/Crossfire. In that case they are limited to 8 lanes per card, which allegedly is a noticeable (measurable) bottleneck. With the doubled throughput of PCI-e 3.0, an 8/8 lane split will no longer be a bottleneck, and those people won't have to shell out heaps of money for an "enthusiast" board and CPU either.
