Seeing the Future: DisplayPort 1.2

While Barts doesn’t bring a massive overhaul to AMD’s core architecture, it’s a different story for the secondary controllers contained within the chip. Compared to Cypress, practically everything involving displays and video decoding has been refreshed, replaced, or overhauled, making these feature upgrades the defining change for the 6800 series.

We’ll start on the display side with DisplayPort. AMD has been a major backer of DisplayPort since the standard was created in 2006, and in 2009 they went as far as making DisplayPort part of the standard port configuration for most of their 5000 series cards. Furthermore, for AMD, DisplayPort goes hand-in-hand with their Eyefinity initiative: because DisplayPort doesn’t require an independent clock generator for each monitor, it’s what allows AMD to efficiently drive 6 monitors from a single card.

So with AMD’s investment in DisplayPort it should come as no surprise that they’re ready with support for the next version of DisplayPort less than a year after the specification was finalized. The Radeon HD 6800 series cards will be the first products anywhere to ship with DP1.2 support – in fact AMD can’t even call it DP1.2 Compliant, because the other devices needed for compliance testing aren’t available yet. Instead they’re calling it DP1.2 Ready for the time being.

So what does DP1.2 bring to the table? On a technical level, the only major change is that DP1.2 doubles DP1.1’s bandwidth, from 10.8Gbps (8.64Gbps video) to 21.6Gbps (17.28Gbps video); or to put this in DVI terms, DP1.2 will have roughly twice as much video bandwidth as a dual-link DVI port. It’s this doubling of DisplayPort’s bandwidth, along with a handful of newly defined standards, that enables DP1.2’s new features.
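As a quick sanity check of those numbers – and this is just back-of-the-envelope math on our part, using DisplayPort's standard per-lane rates and 8b/10b line coding, plus the usual 2x165MHz/24bpp figure for dual-link DVI – the raw and video bandwidth figures relate as follows (the helper function below is purely illustrative):

```python
# Back-of-the-envelope DisplayPort bandwidth math (illustrative only).
# DisplayPort uses 8b/10b line coding, so only 80% of the raw link rate
# is available for video data.

def dp_bandwidth_gbps(lane_rate_gbps, lanes=4):
    raw = lane_rate_gbps * lanes      # total link rate
    video = raw * 8 / 10              # usable video payload after 8b/10b
    return raw, video

print(dp_bandwidth_gbps(2.7))         # DP1.1 HBR:  (10.8, 8.64)
print(dp_bandwidth_gbps(5.4))         # DP1.2 HBR2: (21.6, 17.28)

# Dual-link DVI for comparison: 2 TMDS links x 165MHz pixel clock x 24bpp
dl_dvi_gbps = 2 * 165e6 * 24 / 1e9
print(round(dl_dvi_gbps, 2))          # ~7.92Gbps, roughly half of DP1.2's 17.28Gbps
```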

At the moment the feature AMD is touting the most with DP1.2 is its ability to drive multiple monitors from a single port, which relates directly to AMD’s Eyefinity technology. DP1.2’s bandwidth upgrade means that it has more than enough bandwidth to drive even the largest consumer monitors; more specifically, a single DP1.2 link has enough bandwidth to drive two 2560x1600 monitors or four 1920x1200 monitors at 60Hz. Furthermore, because DisplayPort is a packet-based transmission medium, it’s easy to expand its feature set, since devices only need to know how to handle packets addressed to them. For these reasons multiple display support was canonized into the DP1.2 standard under the name Multi-Stream Transport (MST).
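To see how those monitor counts fit within DP1.2's 17.28Gbps of video bandwidth, here is a rough sketch; the ~10% blanking-interval allowance is our own assumption for illustration rather than a figure from the spec:

```python
# Rough MST bandwidth budget check (illustrative; the blanking overhead is an
# assumed ~10% on top of the active pixel data).

DP12_VIDEO_GBPS = 17.28
BLANKING_OVERHEAD = 1.10

def stream_gbps(width, height, refresh_hz, bpp=24):
    return width * height * refresh_hz * bpp * BLANKING_OVERHEAD / 1e9

two_2560  = 2 * stream_gbps(2560, 1600, 60)   # ~13.0Gbps
four_1920 = 4 * stream_gbps(1920, 1200, 60)   # ~14.6Gbps

print(two_2560  <= DP12_VIDEO_GBPS)   # True -- two 2560x1600@60Hz streams fit
print(four_1920 <= DP12_VIDEO_GBPS)   # True -- four 1920x1200@60Hz streams fit
```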

MST, as the name implies, takes advantage of DP1.2’s bandwidth and packetized nature by interleaving several display streams into a single DP1.2 link, with an independent display stream for each monitor. Meanwhile on the receiving end there are two ways to handle MST: daisy-chaining and hubs. Daisy-chaining is rather self-explanatory, with one DP1.2 monitor plugged into the next to pass the signal along to each successive monitor. In practice we don’t expect to see daisy-chaining used much except on prefabricated multi-monitor setups, as it requires DP1.2 monitors and can be clumsy to set up.

The alternative method is to use a DP1.2 MST hub. An MST hub splits up the signal between client devices, and in spite of what the name “hub” may imply, an MST hub is actually a smart device – it’s closer to a USB hub that actively processes signals than to an Ethernet hub that blindly passes things along. The importance of this distinction is that an MST hub does away with the need for DP1.2 compliant monitors, as the hub takes care of separating the display streams while communicating with the host via DP1.2. Furthermore MST hubs are compatible with adaptors, meaning DVI/VGA/HDMI ports can be created off of an MST hub by using the appropriate active adaptor. At the end of the day the MST hub is how AMD and other manufacturers are going to drive multiple displays from devices that don’t have the space for multiple outputs.

For Barts AMD is keeping parity with Cypress’s display controller, giving Barts the ability to drive up to 6 monitors. Unlike with Cypress, however, the existence of MST hubs means that AMD doesn’t need to dedicate all of the space on a card’s bracket to mini-DP outputs; instead AMD is using 2 mini-DP ports to drive 6 monitors in a 3+3 configuration. This in turn renders the Eyefinity6 line as we know it redundant, as AMD & partners no longer need to produce separate E6 cards now that every Barts card can drive 6 DP monitors. Thus as far as AMD’s Eyefinity initiative is concerned, it just became a lot more practical to do a 6 monitor Eyefinity setup on a single card, performance notwithstanding.

For the moment the catch is that AMD is the first company to market with a product supporting DP1.2, putting the company in a chicken & egg position, with AMD serving as the chicken. MST hubs and DP1.2 displays aren’t expected to be available until early 2011 (hint: look for them at CES), which means it’s going to be a bit longer before the rest of the hardware ecosystem catches up to what AMD can do with Barts.

Besides MST, DP1.2’s bandwidth has three other uses for AMD: higher resolutions/bitdepths, bitstreaming audio, and 3D stereoscopy. As DP1.1’s video bandwidth was only comparable to DL-DVI, the monitor limits were similar: 2560x2048@60Hz with 24bit color. With DP1.2’s doubled bandwidth, AMD can now drive larger and/or higher bitdepth monitors over DP: 4096x2160@50Hz for the largest monitors, and a number of lower resolutions with 30bit color. In our conversation with AMD Senior Fellow and company DisplayPort guru David Glen, higher color depths in particular came up a number of times. Although David isn’t necessarily speaking for AMD here, it’s his belief that we’re going to see higher color depths become important in the consumer space over the next several years as companies look to add new features and functionality to their monitors. And it’s DisplayPort that he wants to use to deliver that functionality.
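Running the same back-of-the-envelope pixel math (again assuming roughly 10% blanking overhead on our part) shows where the DP1.1 ceiling sat and what the extra bandwidth buys:

```python
# Illustrative check of the resolution/bit-depth figures above
# (assumes ~10% blanking overhead on top of the active pixels).

DP11_VIDEO_GBPS = 8.64
DP12_VIDEO_GBPS = 17.28
BLANKING = 1.10

def needed_gbps(width, height, refresh_hz, bpp):
    return width * height * refresh_hz * bpp * BLANKING / 1e9

print(round(needed_gbps(2560, 2048, 60, 24), 1))  # ~8.3Gbps  -> right at the DP1.1 limit
print(round(needed_gbps(4096, 2160, 50, 24), 1))  # ~11.7Gbps -> needs DP1.2
print(round(needed_gbps(2560, 2048, 60, 30), 1))  # ~10.4Gbps -> 30bit color needs DP1.2 too
```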

Along with higher color depths at higher resolutions, DP1.2 also improves on the quality of the audio passed along by DP. DP1.1 was capable of passing along multichannel LPCM audio, but it only had 6.144Mbps available for audio, which ruled out multichannel audio at high bitrates (e.g. 8 channel LPCM at 192kHz/24bit) or even compressed lossless audio. With DP1.2 the audio channel has been increased to 48Mbps, giving DP enough bandwidth for unrestricted LPCM along with support for Dolby and DTS lossless audio formats. This brings it up to par with HDMI, which has been able to support these features since HDMI 1.3.
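The LPCM case is easy to verify; the stream parameters below are simply the 8 channel 192kHz/24bit example mentioned above:

```python
# Uncompressed LPCM bandwidth for the 8-channel 192kHz/24-bit example above.
channels, sample_rate_hz, bits_per_sample = 8, 192_000, 24

lpcm_mbps = channels * sample_rate_hz * bits_per_sample / 1e6
print(lpcm_mbps)            # 36.864 Mbps
print(lpcm_mbps <= 6.144)   # False -- too much for DP1.1's audio channel
print(lpcm_mbps <= 48.0)    # True  -- fits comfortably within DP1.2's 48Mbps
```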

Finally, much like how DP1.2 goes hand-in-hand with AMD’s Eyefinity initiative, it also goes hand-in-hand with the company’s new 3D stereoscopy initiative, HD3D. We’ll cover HD3D in depth later, but for now we’ll touch on how it relates to DP1.2. With its additional bandwidth, DP1.2 now offers more bandwidth than either HDMI 1.4a or DL-DVI, which AMD believes is crucial to enabling better 3D experiences. Case in point: for 3D, HDMI 1.4a maxes out at 1080p24 (48Hz total), which is enough for a full resolution movie in 3D but isn’t enough for live action video or 3D gaming, both of which require 120Hz in order to achieve 60Hz in each eye. DP1.2 on the other hand could drive 2560x1600 @ 120Hz, giving 60Hz to each eye at resolutions above full HD.
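The same pixel math (with our assumed ~10% blanking overhead) illustrates the gap between HDMI 1.4a's 3D ceiling and what DP1.2 has room for:

```python
# Illustrative stereoscopic 3D bandwidth comparison (~10% blanking assumed).
BLANKING = 1.10

def needed_gbps(width, height, total_refresh_hz, bpp=24):
    return width * height * total_refresh_hz * bpp * BLANKING / 1e9

print(round(needed_gbps(1920, 1080, 48), 1))    # ~2.6Gbps  -- 1080p24 per eye (HDMI 1.4a 3D)
print(round(needed_gbps(1920, 1080, 120), 1))   # ~6.6Gbps  -- 1080p60 per eye
print(round(needed_gbps(2560, 1600, 120), 1))   # ~13.0Gbps -- fits within DP1.2's 17.28Gbps
```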

Ultimately this blurs the line between HDMI and DisplayPort and whether they’re complementary or competitive interfaces, but you can see where this is going. The most immediate benefit would be making it possible to play Blu-ray 3D in a window, as it currently has to be played in full screen mode when using HDMI 1.4a in order to make use of 1080p24.

In the meantime however, the biggest holdup is still going to be adoption. Support for DisplayPort is steadily improving, with most Dell and HP monitors now supporting DisplayPort, but a number of other parties still do not support it, particularly among the cheap TN monitors that crowd the market these days. AMD’s DisplayPort ambitions are still reliant on more display manufacturers including DP support on all of their monitors, and on retailers like Newegg and Best Buy making it easier to find and identify monitors with DP support. CES 2011 should give us a good indication of how much support there is for DP on the display side of things, as display manufacturers will be showing off their latest wares.

Comments

  • GeorgeH - Friday, October 22, 2010 - link

    WRT comments complaining about the OC 460 -

    It's been clear from the 460 launch that a fully enabled and/or higher clocked 460 would compete very well with a 470. It would have been stupid for NVIDIA to release such a card, though - it would have made the already expensive GF100 even more so by eliminating a way to get rid of their supply of slightly defective GF100 chips (as with the 465) and there was no competitive reason to release a 460+.

    Now that there is a competitive reason to release one, do you really think Nvidia is going to sit still and take losses (or damn close to it) on the 470 when it has the capability of launching a 460+? Do you really think that Nvidia still can't make fully functional GF104 chips? Including the OC 460 is almost certainly Ryan's way of hinting without hinting (NDAs being what they are) what Nvidia is prepping for release.

    (And if you really think AT is anyone's shill, you're obviously very new to AT.)
  • AnandThenMan - Friday, October 22, 2010 - link

    "And if you really think AT is anyone's shill, you're obviously very new to AT."

    Going directly against admitted editorial policy doesn't exactly bolster your argument now, does it? As for your comment about a 460+ or whatever you were trying to say, who cares? Reviews are supposed to be about hardware that is available to everyone now, not some theoretical card in the future.
  • MGSsancho - Friday, October 22, 2010 - link

    A vendor could just as likely sell an overclocked 470 card as well as a 480. But I think you made the right assumption that team green might be releasing overclocked cards that all have a minimum of 1GB of RAM to make it look like their cards are faster than team red's. Maybe at near equal price points the green cards will all be 20~30% overclocked to make it look like they are 10% faster than the red offerings at similar prices. Red cards could just be sold overclocked as well (we have to wait a bit more to see how well they overclock). All of this does not really matter. At the end of the day, buyers will look at what's the fastest product they can purchase at their price point. Maybe secondly they will notice that hey, this thing gets hot and is very loud, and just blindly blame the green/red suits, and thirdly they will look at features. Who really knows.

    Personally I purchase the slightly slower products and then overclock them myself if I find a game that needs it. I would rather have the headroom vs buying a card that is always going to be hot enough to rival volcanoes, even if it is factory warrantied.
  • Golgatha - Friday, October 22, 2010 - link

    The nVidia volcanoes comment is really, really overstated. I have a mid-tower case with a 120mm exhaust and 2x92mm intakes (Antec Solo for reference), and a GTX 480. None of these case fans are high performance fans. Under very stressful gaming conditions, I hit in the 80-85°C range, and Folding@Home's GPU3 client will get it up to 91°C under 100% torturous load.

    Although I don't like the power consumption of the GTX 480 for environmental reasons, it is rock solid stable, has none of the drawbacks of multi-GPU setups (I actually downgraded from a Crossfire 5850 setup due to game crashing and rendering issues), and it seems to be top dog in a lot of cases when it comes to minimum FPS (even when compared to multi-GPU setups).
  • Parhel - Friday, October 22, 2010 - link

    "And if you really think AT is anyone's shill, you're obviously very new to AT"

    I think you're referring to me, since I'm the one who used the word "shill." Let me tell you, I've been reading AT since before Tom's Hardware sucked, and that's a loooong time.

    If I were going to buy a card today, I'd buy the $180 GTX 460 1GB, no question. I'm not an AMD fan, nor am I an NVidia fan. I am, however, an Anandtech fan. And their decision to include the FTW edition card in this review means I can no longer come here and assume I'm reading something unbiased and objective.
  • GeorgeH - Friday, October 22, 2010 - link

    It was actually more of a shotgun blast aimed at the several silly posts implying AT was paid off by EVGA or Nvidia.

    If you've been reading AT for ~10 years, why would you assume that Ryan (or any other longtime contributor) suddenly decided to start bowing to outside pressure? If you stop lighting the torches and sharpening the pitchforks for half a second, you might realize that Ryan probably has a very good reason for including the OC card.

    Even if I'm smoking crack WRT a GTX460+, what's the point of a review? It's not to give AMD and Nvidia a "fair" fight, it's to give us an idea of the best card to spend our money on - and if AMD or Nvidia get screwed in the process, I'm not going to be losing any sleep.

    Typically, OC cards with a significant clock bump are fairly rare "Golden samples" and/or only provide marginal performance benefits without significantly increasing heat, noise, and power consumption. With the 460, Nvidia all but admitted they could've bumped the stock clocks quite significantly, but didn't want to threaten their other cards (*cough* 470 *cough*) if they didn't have to. This is reflected in what you can actually buy at Newegg - of the ~30 1GB 460's, only ~5 are running stock. 850MHz is still high, but is also right in line with the average of what you can expect any 460 to get to, so I don't think it's too far out of place.

    Repeating what I said above, including the OC card was unfair to AMD, but is highly relevant to me and my wallet. I couldn't care less if AMD (or Nvidia) get screwed by an AT review - I just want to know what's best for me, and this article delivers. If the tables were turned, I'm sure that Ryan would have no problem including an OC AMD card in a Nvidia review - because it isn't about being a shill, it's about informing me, the consumer.
  • SandmanWN - Friday, October 22, 2010 - link

    What? Put the crack down... Really, if you are short on time to review a product and you steal time away from that objective just to review a specially delivered, hand-selected opponent's card instead of completing your assignment, then you've not exactly been genuine to your readers or, in this case, to AMD.

    If you have time to add in an overclocked card then you need to do the same with the review card, otherwise the OC'd cards need to wait another day.

    I have no idea how you can claim some great influence on your wallet when you have no idea of the OC capabilities of the 6000 series. If you actually bought the 460 off this review then you are banking that the overclock will hold up against an unknown variable. That's not exactly relevant to anyone's wallet.
  • GeorgeH - Friday, October 22, 2010 - link

    An OC'd 460 competes with the 6870, and the 6870 doesn't really overclock at all.

    Even overclocked, a 6850 isn't going to touch a 6870, unless you're going to well over 1GHz (which short of a miracle isn't going to happen.)

    It was disappointing that the review wasn't fleshed out more, but I'd say what's missing isn't as relevant to my buying decisions as how well the plethora of OC'd 460s compare to the 6870.
  • Parhel - Saturday, October 23, 2010 - link

    "the 6870 doesn't really overclock at all"

    What? You're talking out of your ass. No review site has even attempted a serious overclock yet. It's not even possible, as far as I know, to modify the voltage yet! We have no way to gauge how these cards overclock, and won't for several weeks.

    "850MHz is still high, but is also right in line with the average of what you can expect any 460 to get to"

    Now you're sounding like the shill. 850MHz is not a realistic number if we're talking about 24/7 stability with stock cooling. No way.
  • GeorgeH - Saturday, October 23, 2010 - link

    850MHz unrealistic? Nvidia flat out admitted that most cards are capable of at least ~800MHz (no volt mods, no nothing) and reviews around the web have backed this up, showing low to mid 800's on most stock cards, at stock voltages, running stock cooling. If you're worried about reliability, grab one of the many cards that come factory OC'd with a warranty.

    The 6870 doesn't now and never will overclock much at all, at least not in the way the 460 does. As with any chip, there will be golden sample cards that will go higher with voltage tweaks and extra cooling, but AMD absolutely did not leave ~20-25% of the 6870's average clockspeed potential on the table. The early OC reviews back this up as well, showing the 6870 as having minimal OC'ing headroom at stock voltages.

    If you're waiting to compare the maximum performance that you can stretch out of a cherry-picked 6870 with careful volt mods and aftermarket cooling, you're going to be comparing it with a 460 @ ~950MHz, not ~850MHz.

    As a guess, I'd say that your ignorance of these items is what led you to be so outraged at the inclusion of the OC 460 in the review. The magnitude of the OC potential of the 460 is highly atypical (at least in mid-range to high end cards), which is why I and many other posters have no issue with its similarly atypical inclusion in the review.
