PCI-Express Compliance: Does It Even Matter?

For a while now we’ve been under the impression that video card size and power consumption were ultimately capped by the PCI-Express specification. At present the specification and its addendums specify normal (75W), 150W, 225W, and 300W PCIe card operation. In the case of 300W cards in particular, this is achieved through 75W from the PCIe slot, 75W from a 6-pin PCIe power connector, and 150W from an 8-pin PCIe power connector. As the name implies, the PCIe specification also defines what the 6-pin and 8-pin power connectors are supposed to be capable of, which is where the 75W and 150W figures come from respectively.
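
To put numbers on it, here is a quick back-of-the-envelope sketch of how those budgets add up. The wattage figures are the ones cited above; the helper function itself is purely illustrative and not anything defined by the specification:

```python
# Power budgets described in the PCIe specification (figures as cited above).
SLOT_W = 75    # power deliverable through the PCIe x16 slot itself
PIN6_W = 75    # rating of a 6-pin PCIe power connector
PIN8_W = 150   # rating of an 8-pin PCIe power connector

def board_power_budget(six_pin=0, eight_pin=0):
    """Total board power budget for a card with the given auxiliary connectors."""
    return SLOT_W + six_pin * PIN6_W + eight_pin * PIN8_W

print(board_power_budget())                        # 75W  - slot only
print(board_power_budget(six_pin=1))               # 150W - slot + one 6-pin
print(board_power_budget(six_pin=2))               # 225W - e.g. slot + two 6-pin
print(board_power_budget(six_pin=1, eight_pin=1))  # 300W - slot + 6-pin + 8-pin
```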

Altogether the biggest, most powerful card configuration in the PCIe specification allows for a 12.283”-long, triple-wide card that consumes 300W. To date we’ve never seen a card exceed the physical specifications, but we have seen several cards exceed the electrical specifications. This includes cards such as the 5970 and some overclocking-oriented 5870s that were designed to handle more than 300W when overclocked, and even more exotic cards such as the Asus ARES 5870X2 that simply drew more than 300W from the get-go. However, we have yet to see a reference design from AMD or NVIDIA that exceeds any part of the PCIe specification by default.

So it has been clear for some time now that cards can exceed the PCIe specifications without incurring the immediate wrath of an army of lawyers, but at the same time this doesn’t establish what is gained or lost by being (or not being) PCIe compliant. For a reference design to exceed the PCIe specifications is certainly a new mark for the GPU industry, so we decided to get right to the bottom of the matter and answer the following question: does PCI-Express compliance matter?

To answer this question we went to two parties. The first was, of course, AMD, whose product is in question. AMD’s answer basically amounts to a polite deflection: it’s an ultra-enthusiast card that at default settings does not exceed the power available from the combination of the PCIe slot and PCIe power connectors. Furthermore, as they correctly note, the 6990 is not the first card to ship at over 300W, as the ARES and other cards were drawing more than 300W a year ago. It’s a polite answer that glosses over the fact that no, the 6990 isn’t technically PCIe compliant.

To get a second opinion on the matter we went straight to the source: the Peripheral Component Interconnect Special Interest Group (PCI-SIG), the industry group that defines the PCIe standard and runs the workshops that test for product compliance. The PCI-SIG’s member list includes virtually everyone in the computing industry, including AMD, NVIDIA, and Intel, so everyone has some level of representation within the group.

So what does the PCI-SIG think about cards such as the 6990 that exceed the PCIe specification? In a nutshell, they don’t directly care. The group’s working philosophy is closer to approving cards that work than it is to strictly enforcing standards, so their direct interest in the matter is limited. The holy grail of the PCI-SIG is the PCI Express Integrators List, which lists all the motherboards and add-on cards that have passed compliance testing. The principal purpose of the list is to help OEMs and system integrators choose hardware, relying on the list, and by extension PCI-SIG testing, to confirm that a product meets the PCIe standards so that they can be sure it will work in their systems.

The Integrators List is more or less exclusively OEM-focused, which means it has little significance for niche products such as the 6990, whose sales are split between end-user installation and highly customized OEM builds. The 6990 does not need to be on the list to be sold to its target market. Similarly, the 5970 was never submitted or approved for listing, and we wouldn’t expect the 6990 to be submitted either.

It is worth noting, however, that while the PCI-SIG does have power specifications, they’re not a principal concern of the group, which wants to avoid doing anything that would limit product innovation. While the 300W specification was laid out under the belief that a further specification would not be necessary, the PCI-SIG does not even test for power specification compliance under its current compliance testing procedures. Conceivably the 6990 could be submitted and could pass the test, leading to it being labeled PCIe compliant. Of course, it’s equally conceivable that the PCI-SIG could start doing power compliance testing if it became an issue…

At the end of the day, as the PCI-SIG is a pro-compliance organization as opposed to a standards-enforcement organization, there’s little for AMD or their partners to lose by not being compliant with the PCIe power specifications. By not having passed compliance testing, the only “penalty” for AMD is that they cannot claim the 6990 is PCIe compliant; funnily enough, they can even use the PCIe logo (we’ve already seen a Sapphire 6990 box with it). So does PCIe compliance matter? For mainstream products PCIe compliance matters for the purposes of securing OEM sales; for everything else, including niche products like the 6990, PCIe compliance does not matter.

130 Comments

  • Figaro56 - Tuesday, March 8, 2011 - link

    2 HD 6970 Cards for $640? I don't think so! These cards are over $300 everywhere. I purchased 2 for $710 shipped and I thought that was a deal. Maybe reviews like yours here inflated the price and I purchased after the price adjustment. I have the same luck with gasoline on days I fill my tank.
  • ViRGE - Tuesday, March 8, 2011 - link

    Looking at the Egg, there's 2 different 6970s at $320, which is probably where AT got $640 from.

    http://www.newegg.com/Product/Product.aspx?Item=N8...
  • Figaro56 - Tuesday, March 8, 2011 - link

    All right, you got me there. I only buy XFX double lifetime warranty cards when I start spending this much on replacing my dual GPU solution.

    I seem to manage to actually re-sell my used video cards when I can offer them to a buyer with a lifetime warranty. XFX's double lifetime warranty is not a sales gimmick; it works. Heck, I would buy a used card if it had a lifetime warranty; it's kind of a no-brainer, given you actually want to buy that card in the first place.
  • Arbie - Tuesday, March 8, 2011 - link

    Thanks for keeping the Crysis Warhead minimum FPS charts!! To me, Crysis/Warhead remains the defining game (and not only technically). I don't even look at the numbers on the other titles.

    Also of prime importance to me are the idle power and, to a slightly lesser extent, idle noise.

    Of course, like most people reading your review, I wouldn't be buying a 6990 even if it were silent. In fact, given that PC graphics requirements are apparently ramping down to console levels, I wonder how AMD/Nvidia are going to sell any significant number of cards above midrange. My HD 5770 will run everything at 1920x1200, though not always with all sliders maxed. However, I don't see much if any difference (in DX9) when I do enable 4xAA vs 2xAA etc. Certainly not enough to double the price of this $140 card.

    A nit on the Crysis Warhead minimum fps chart for 1920x1200 Frost Bench - Gamer Quality - Enthusiast Shaders + 4xAA: Your Dec 10 chart shows 6970CF at 66.2 fps but this Mar 11 chart shows 66.6. Can you believe anyone would actually notice this, much less comment on it? We are too absorbed in this tech stuff (ain't it grand...).
  • strikeback03 - Tuesday, March 8, 2011 - link

    They did say the new drivers made a slight difference; that seems likely to be one of the configurations they retested.
  • morphologia - Tuesday, March 8, 2011 - link

    That isn't portrait orientation in the picture...it's landscape.
  • taltamir - Tuesday, March 8, 2011 - link

    The card was measured at 77.3dB in the article.
    1. At what distance was it measured?
    2. What is its dB measurement 1 meter away?
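
For what it's worth, the second question can only be estimated, since the measurement distance isn't stated in the excerpt here. Under a free-field assumption, sound pressure level falls by roughly 6 dB per doubling of distance, so a rough conversion looks like the sketch below; the 0.3 m measurement distance in it is purely a placeholder, not a figure from the review:

```python
import math

def spl_at_distance(measured_db, measured_at_m, target_m):
    """Free-field estimate: SPL changes by 20*log10(d1/d2) between distances."""
    return measured_db + 20 * math.log10(measured_at_m / target_m)

# Hypothetical: if 77.3 dB were measured at 0.3 m (distance not given here),
# the level at 1 m would be roughly:
print(round(spl_at_distance(77.3, 0.3, 1.0), 1))  # ~66.8 dB
```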
  • taltamir - Tuesday, March 8, 2011 - link

    I just looked it up: gold is worth $1,430/ounce right now.
    I highly doubt a watercooled 6990 will weigh half an ounce.
  • ekrash - Tuesday, March 8, 2011 - link

    The performance bottleneck is also seen in NVIDIA's dual-GPU offerings. Dual-GPU cards operating in x16 PCIe slots must have their data lanes divided between the GPUs, so they are effectively operating at x8 data rates, not x16. Single-GPU cards, by contrast, can utilize all 16 lanes, and even then the PCI Express standard may soon be obsolete. I hope we can look forward to Intel's fiber-optic technology replacing all data bus signalling with a 10GB fiber-optic bus and peripheral interface that can simultaneously and independently carry all of the different protocols used for inter-device and system bus communications. Imagine: soon AMD and NVIDIA could be producing video cards with fiber-optic data buses, which may change the power requirements of present-day PCI Express slots and may change power supply standards to require an additional power connector for the video card, since the 75-watt PCIe slot would be obsolete.

    But ATI and NVIDIA may also have to work with motherboard manufacturers to see whether Intel's "Thunderbolt" fiber-optic data buses can increase or freely throttle video bandwidth through the 10GB interface, which would be tantamount to increasing the link from x16 to x32. It would be almost unlimited video bandwidth, far exceeding anything needed today. Dual GPUs cannot promise full performance with the limitation of the PCIe x16 slot being divided into dual x8 channels, but it would be nice to see how they perform with unlimited bandwidth potential over a single 10GB fiber-optic link. And that would change the battlefield between ATI-AMD and NVIDIA.

    My 4870 X2's (can run Quadfire) still rock on enthusiast settings in Crysis and Warhead without any hiccups, and I've not seen a slowdown of any sort on any level in Crysis.
    The price-to-performance ratio is declining, which may affect my decision to purchase another dual-GPU card, opting instead for single-GPU CF solutions where each card can utilize all 16 lanes.

    BTW, I did notice the lack of data on Crysis @1920x1200 with full enthusiast settings, so that data is missing from this review. It's Gamer quality plus enthusiast shaders, not full enthusiast. As above, the 4870 X2 runs full enthusiast settings; not one setting is scaled back, and not one hiccup, just smooth play throughout on a single 28" display.
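
As a rough illustration of the lane-splitting point raised in the comment above (taking the commenter's premise that each GPU of a dual-GPU card effectively gets half of an x16 link), here is a minimal bandwidth sketch. The 500 MB/s-per-lane figure is the standard PCIe 2.0 number (5 GT/s with 8b/10b encoding); nothing in this snippet comes from the review itself:

```python
# PCIe 2.0: 5 GT/s per lane with 8b/10b encoding -> ~500 MB/s per lane, per direction.
MB_PER_LANE = 500

def pcie2_bandwidth_gbs(lanes):
    """Approximate one-direction bandwidth in GB/s for a PCIe 2.0 link."""
    return lanes * MB_PER_LANE / 1000

print(pcie2_bandwidth_gbs(16))  # 8.0 GB/s for a full x16 link
print(pcie2_bandwidth_gbs(8))   # 4.0 GB/s per GPU if the link is split x8/x8
```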
  • cmdrdredd - Tuesday, March 8, 2011 - link

    Why are we still using Crysis Warhead at "Gamer Quality"????? With cards like these, why not max everything out in-game and then fiddle with AA and the like? I don't get it.
