XFX Radeon R9 280X Double Dissipation

The first of our R9 280X cards is our reference-like sample, XFX’s Radeon R9 280X Double Dissipation. The R9 280X DD is XFX’s sole take on the 280X, pairing a new and significantly revised version of XFX’s Double Dissipation cooler with a 280X running at the reference clocks of 850MHz core, 1000MHz boost, and 6GHz RAM. Since there isn’t a factory overclock here, XFX will primarily be riding on their cooler, build quality, and other value-add features.

Diving right into the design of the card, the R9 280X DD is a fairly traditional open air cooler design, as is common for cards in this power and price range. This basic design is very effective at dissipating large amounts of heat with relatively little noise, making the usual tradeoff of offloading some of the cooling workload onto the system’s chassis (and its larger, slower fans) rather than doing the work entirely on its own.

In XFX’s case this is a new design, forgoing the older Double Dissipation design that we first saw on their 7970BEDD back in 2012. What’s changed? Without going into minute detail, practically everything. At first glance you’re unlikely to even recognize this as an XFX card, as the new cooler aesthetically looks almost nothing like their old design.

First and foremost, XFX has gone for the oversized cooler approach, something that’s become increasingly common as of late. At 100mm in diameter the two fans on XFX’s design are among the biggest we’ve ever seen, pushing the card to just over 11.1 inches long while causing the heatsink and shroud to stand about 0.75" taller than the board itself.

Drilling down, XFX is using a two segment heatsink that together runs the complete length of the card. Providing heat conduction between the GPU and the heatsink is a set of 6 copper heatpipes mounted into a copper base plate; 4 of these heatpipes run towards the rear of the card and the other 2 towards the front, perpendicular to XFX’s vertical fin heatsink. Meanwhile cooling for the various discrete components on the board, including the memory, is provided by a separate cut-out baseplate that covers most of the card. There isn’t any kind of connection between this baseplate and the heatsink proper, so it’s the baseplate and any airflow over it that provides cooling for the MOSFETs it covers.

Moving on to XFX’s board, it looks like XFX isn’t doing anything particularly exotic here. XFX is using their standard Duratec high-end components, which includes solid caps and chokes (typical for all cards in this power category) along with their IP-5X dust free fans. A quick component count turns up 7 power phases, which would be the reference amount for a 280X, meaning we’re looking at 5 phases for the GPU and another 2 phases for the memory and I/O.

Meanwhile for I/O, XFX implements the common Radeon display configuration of 2x DL-DVI, 1x HDMI, and 2x Mini DisplayPort 1.2. External power delivery in turn is provided by a set of 6-pin + 8-pin power connectors, as is to be expected for a 250W card. With that in mind XFX’s design should have at least some overclocking headroom, but XFX doesn’t provide any overclocking software of their own, so you’ll need to stick with Catalyst Overdrive or 3rd party utilities such as MSI Afterburner.
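
As a quick sanity check on the power budget (going by the standard PCIe power delivery limits rather than any figures from XFX): the PCIe slot can supply up to 75W, the 6-pin connector another 75W, and the 8-pin connector 150W, for roughly 300W available in total. That comfortably covers the 280X’s 250W typical board power and leaves a modest margin for overclocking.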

Finally, as a Double Dissipation product the 280X DD is covered by XFX’s lifetime warranty policy, contingent on registering the card within 30 days of purchase. Interestingly, XFX remains one of the few board partners that still offers any kind of lifetime warranty, making them fairly exceptional in that regard. As for pricing, we’re listing the XFX card at $329 for the moment, though there is still some confusion over whether that’s the final price, as our XFX rep seemed unsure of it. As is sometimes the case in this industry, we get the impression that XFX was waiting to see what other manufacturers were going to charge, in which case we suspect the actual launch price will be lower. We’ll update this article once we have final pricing information.

151 Comments

  • alfredska - Wednesday, October 9, 2013 - link

    Yes, I made a couple mistakes in trimming the fat from Ryan's writing. I should have done another proof-read myself. This wasn't the point of my post, though.

    Ryan's review is littered with sentence pauses that drastically slow down your ability to read the article. Some examples are: starting too many sentences with words like "ultimately" or "meanwhile"; making needless references to earlier statements; using past or present perfect tense when just past or present tense is appropriate. I wrote the above example hoping that Ryan would put it next to his own writing and see whether he can 1) read it faster, and 2) retain more information from this version.

    I can accept a misspelling here and there and even some accidental word injections of which I was guilty. The fluidity needs work though. If the reader cannot glide easily between paragraphs, they will stop reading and just look at pictures.
  • chuck.norris.and.son - Tuesday, October 8, 2013 - link

    tl;dr :( blablabla

    Can't you nail it down: AMD or Nvidia? Which GFX card should I buy to play blockbusters like BF4?
  • ShieTar - Tuesday, October 8, 2013 - link

    How about not buying any new GFX card and investing the savings into books in order to improve your reading skills?
  • Will Robinson - Tuesday, October 8, 2013 - link

    Radeon 280X will be the sweet spot card to get for BF4.
    R9 290X will be the open class champ over GTX780 I suspect.
  • piroroadkill - Tuesday, October 8, 2013 - link

    The best AMD card you can buy with your money, simply because Battlefield 4 will eventually feature the Mantle renderer which is for GCN cards only, and will probably be a killer feature.
  • hrga - Tuesday, October 8, 2013 - link

    Most moronic branding ever (at least until the next one overruns it). They cut out the vanilla 7850 or top end 7950 from the HD7000 lineup and renamed them with confusing R9/R7 branding, unrealistically stupid marketing where nothing material stands behind those names.

    Another rebranding?
    - Yes. [thinking. What the heck did you expect guys]

    Not most successful?
    - [thinking. depends on POV] Well, it's here to milk the most cash, as our CPU business didn't produce anything valuable for three years. And we also must have something interesting to present in our slideshow presentation for investors. If we couldn't afford to produce a whole new lineup, we could always produce yet another rebranded line, just like nvidia. We always learn from our (cartel) competition, and customers don't seem to have any objections to that.

    So that's why you retain those moronically high prices?
    - We just adjust that according to our competition (cartel)

    But you never lowered prices for the HD7870, which today celebrates its second birthday and has been produced on highly matured 28nm for at least six months. Instead you just rebranded it for a second time after the HD8860, so now we have the R9 270X too. Don't you think your customers would like to see some new designs while old products go on to discount prices?

    Or at least you could introduce that R9 270, which is the same old HD7870, at lower prices than today's HD7870 retail prices?!
    Instead we get higher prices for the same performance (source Newegg http://imageshack.dk/imagesfree/xLh43741.jpg).
    And why the heck does this mediocre mainstream product get an R9 designation?! You could weaselishly sell this c-rap at the end of 2011, but "Hello AMD!" It's the end of 2013.

    Pitcairn, used in the HD7800/HD8800 series, is a smaller chip than Evergreen in the HD5800, which only three years ago was produced on the troublesome early 40nm process, while this two year old design is now produced on the highly mature TSMC 28nm-HK node, with far better yields, for at least six months now. The HD5850 had the same or even lower prices at EOL (only a year after introduction) than today's two year old Pitcairn design. How do you explain that?
    - Well... milking, you know... When you have a good cartel environment like we have competing with nvidia, we can skyrocket prices. And you know, even Intel's crappy Knights Corner chips produced today at 22nm wouldn't be any cheaper, because Intel knows how to milk money on their tick-tock performance introductions, and they certainly wouldn't give up that experience in the case of the "Chip Previously Known as Larrabee" (CPKL).
  • labodhibo - Tuesday, October 8, 2013 - link

    Must read this.. totally different perspective:
    http://www.techspot.com/review/722-radeon-r9-270x-...
  • AssBall - Tuesday, October 8, 2013 - link

    Well done review. I kinda like what Asus did with its 280x version.

    Typo: "Asus calls it “CoolTech” and it’s essentially an effort to build a fan that’s blow a an axial fan and a blower (radial) fan at the same time,"
    [blow -> both?]
  • zlandar - Tuesday, October 8, 2013 - link

    This is why I wanted the Asus 770 card also in the recent 770 GTX roundup. The cooler design seems superior for single GPU purposes as long as you have the room for it in your case.
  • AxialLP7 - Tuesday, October 8, 2013 - link

    Not trying to be AFC, just want to make sense of this: "Asus calls it “CoolTech” and it’s essentially an effort to build a fan that’s blow a an axial fan and a blower (radial) fan at the same time, explaining the radial-like center and axial-like outer edge of the fan." Can someone help? This is in the "ASUS RADEON R9 280X DIRECTCU II TOP" section...
