Back to Article

  • Gyro231995 - Tuesday, December 24, 2013 - link

    I have a throbbing erection for this card. Reply
  • psyside1 - Tuesday, December 24, 2013 - link

    No VRM temps in the review; it's kinda no use showing amazing core temps if the VRMs aren't measured. Reply
  • rf525256 - Thursday, December 26, 2013 - link

    To die. Reply
  • blanarahul - Tuesday, December 24, 2013 - link

    Those noise levels. My jaw dropped on the floor. Great job Sapphire. Really great job. No wonder AMD doesn't care about their coolers. Reply
  • blanarahul - Tuesday, December 24, 2013 - link

    I am so surprised that I read the OCing portion again. Wow. Dissipating 125 W more than the 780 Ti at the exact same noise and temperature. This cooler is.... really something else. Reply
  • jasonelmore - Tuesday, December 24, 2013 - link

    well just look at the cooler. It's very expensive. All those pure copper heatpipes and tons of aluminium surface area. I wouldn't be surprised if this cooler sells for $80 all by its lonesome. Reply
  • testbug00 - Wednesday, December 25, 2013 - link

    and to think how much the 780/Titan/780ti cooler costs.... Well, the cooler NVidia uses should crush this one based on price. Reply
  • rf525256 - Thursday, December 26, 2013 - link

    To die. Reply
  • krutou - Tuesday, December 24, 2013 - link

    A non-reference cooler beating a reference cooler. This is news? Reply
  • yannigr - Wednesday, December 25, 2013 - link

    Every custom cooler will be news thanks to AMD's reference coolers, Hawaii's performance, and the price it sells for. 10-15 more 290/290X models with real availability in the market and Nvidia will HAVE to drop the 780 Ti's price close to $550. The 780 should go down to $400. Reply
  • psyside1 - Thursday, December 26, 2013 - link

    So no update for VRM temps? Reply
  • FookDuSushi - Monday, January 06, 2014 - link

    You wish. Nvidia sucks too much dick to do that. Reply
  • tnypxl - Wednesday, December 25, 2013 - link

    No. It's a review. Reply
  • psyside1 - Friday, December 27, 2013 - link

    So what if it's a review? Reply
  • yacoub35 - Thursday, December 26, 2013 - link

    It will be even quieter when one or more of the fans dies within three months of ownership, if it's anything like the Sapphire cards I've owned. Reply
  • tteksystems - Friday, December 27, 2013 - link

    All fans can be replaced with even better, quieter ones, as is the practice of almost every enthusiast. I would change all fans to something better or use a water cooler. Cooling does not have to be so complicated, or even sophisticated. Choosing better after-market fans and/or water cooling always solves the cooling issues and is well worth the investment. Reply
  • TheJian - Saturday, December 28, 2013 - link

    Don't forget all of your solutions void the warranty. Most of us don't like that ;) Also if you're going to blow the warranty, why buy a more expensive card like this, just buy ref if warranty means nothing to you. The point of buying this over ref is YOU USE THESE FANS with warranty INTACT.

    One more point, none of this is needed with ANY 780TI card. Just OC to max and have a nice day. No fan mods etc needed. Warranty still good. Sell the 3 AAA games if you don't want them to make up most of the difference and be happy you got a better card that probably won't have a phase2 driver in its future ;)
  • K_Space - Sunday, December 29, 2013 - link

    That's very much anecdotal evidence; I've got Sapphire HD 5870 Vapor-X cards in Crossfire alive and kicking since release day back in ~2009. (I still think it's one of the most under-rated cards from the red camp)
    Broadly speaking, Sapphire is very much a reputable company and their build quality is up there amongst AMD board partners. I'm sure this product will be no different.
  • kmmatney - Friday, January 03, 2014 - link

    I have a 560 GTX-Ti Hawk card, with a Twin Frozr III cooler which sounds like an airplane while gaming. I tried to adjust the fan settings to be quieter, but the temps would easily go above 100C. On a whim, I removed the fan cover exposing the heat sink, and zip-tied a quiet 120 mm fan above it. Not only is the card near silent now, but the temps are a good 20 deg cooler. So if you have the space (it will become a tri-slot card) I think any card can have good, silent cooling with this technique. Reply
  • LordOfTheBoired - Wednesday, December 25, 2013 - link

    Our frothing desire for this video card increases! Reply
  • ShieTar - Tuesday, December 24, 2013 - link

    "Curiously, the [idle] power consumption of the 290 Tri-X OC is notably lower than the reference 290."

    Well, it runs about 10°C cooler, and silicone does have a negative temperature coefficient of electrical resistance. That 10°C should lead to a resistance increase of a few %, and thus to a lower current of a few %. Here's a nice article about the same phenomenon observed going from a stock 480 to a Zotac AMP! 480:

    The author over there was also initially very surprised. Apparently kids these days just don't pay attention in physics class anymore ...
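The back-of-envelope reasoning above can be sketched in a few lines. The temperature coefficient used here is an assumed, illustrative value, not a measured figure for Hawaii or any real GPU:

```python
# Back-of-envelope for the comment above: with a negative temperature
# coefficient, a cooler chip has HIGHER resistance, so at a fixed rail
# voltage it draws less current and less power (P = V^2 / R).
# ALPHA is an assumed, illustrative coefficient, not a Hawaii spec.

def resistance(r0, alpha, delta_t):
    """Linearized R(T) = R0 * (1 + alpha * dT) around the reference temp."""
    return r0 * (1 + alpha * delta_t)

def power_at_fixed_voltage(v, r):
    return v ** 2 / r

R0 = 1.0        # normalized resistance at the reference temperature
V = 1.0         # normalized rail voltage
ALPHA = -0.005  # assumed -0.5%/°C (negative, semiconductor-like)

p_ref = power_at_fixed_voltage(V, resistance(R0, ALPHA, 0.0))
p_cool = power_at_fixed_voltage(V, resistance(R0, ALPHA, -10.0))
print(f"relative power when 10°C cooler: {p_cool / p_ref:.3f}x")
```

With these made-up numbers the 10°C drop raises resistance by 5% and cuts fixed-voltage power by roughly the same fraction, which is the shape of the effect being described, not its true magnitude.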
  • EarthwormJim - Tuesday, December 24, 2013 - link

    It's mainly the leakage current which decreases as temperature decreases, which can lead to the reductions in power consumption. Reply
  • Ryan Smith - Tuesday, December 24, 2013 - link

    I had considered leakage, but that doesn't explain such a (relatively) massive difference. Hawaii is not a leaky chip, meanwhile if we take the difference at the wall to be entirely due to the GPU (after accounting for PSU efficiency), it's hard to buy that 10C of leakage alone is increasing idle power consumption by one-third. Reply
  • The Von Matrices - Wednesday, December 25, 2013 - link

    In your 290 review you said that the release drivers had a power leak. Could this have been fixed and account for the difference? Reply
  • Samus - Wednesday, December 25, 2013 - link

    Quality vrms and circuitry optimizations will have an impact on power consumption, too. Lots of factors here... Reply
  • madwolfa - Wednesday, December 25, 2013 - link

    This card is based on reference design. Reply
  • RazberyBandit - Friday, December 27, 2013 - link

    And based does not mean an exact copy -- it means similar. Some components (caps, chokes, resistors, etc.) could be upgraded and still fill the bill for the base design. Some components could even be downgraded, yet the card would still fit the definition of "based on AMD reference design." Reply
  • Khenglish - Wednesday, December 25, 2013 - link

    Yes power draw does decrease with temperature, but not because resistance drops. Resistance dropping has zero effect on power draw. Why? Because processors are all about pushing current to charge and discharge wire and gate capacitance. Lower resistance just means that happens faster.

    The real reason power draw drops is due to lower leakage. Leakage current is completely unnecessary and is just wasted power.

    Also an added tidbit. The reason performance increases while temperature decreases is mainly due to the wire resistance dropping, not an improvement in the transistor itself. Lower temperature decreases the number of carriers in a semiconductor but improves carrier mobility. There is a small net benefit to how much current the transistor can pass due to temperature's effect on silicon, but the main improvement is from the resistance of the copper interconnects dropping as temperature drops.
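The leakage point above can be illustrated with the common rule of thumb that leakage current roughly doubles every ~10°C. The reference wattage and temperatures below are made-up numbers purely for illustration:

```python
# Rough illustration of the leakage point above, using the common rule
# of thumb that leakage current roughly doubles every ~10°C.
# P_REF and the temperatures are assumed, illustrative numbers only.

def leakage_power(p_ref, t, t_ref=90.0, doubling_interval=10.0):
    """Leakage at temperature t (°C), given leakage p_ref at t_ref."""
    return p_ref * 2 ** ((t - t_ref) / doubling_interval)

P_REF = 20.0  # assumed watts of leakage at 90°C
for t in (90.0, 80.0, 70.0):
    print(f"{t:.0f}°C: ~{leakage_power(P_REF, t):.1f} W of leakage")
```

Under this rule of thumb, running 20°C cooler would cut the leakage component to a quarter, which is why leakage, rather than interconnect resistance, is the usual explanation for power dropping with temperature.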
  • Totally - Wednesday, December 25, 2013 - link

    Resistance increases with temperature -> Power draw increases P=(I^2)*R. Reply
  • ShieTar - Thursday, December 26, 2013 - link

    The current isn't stabilized; generally, the voltage is: P=U^2/R.

    " Because processors are all about pushing current to charge and discharge wire and gate capacitance. Lower resistance just means that happens faster."

    Basically correct; nevertheless capacitor charging happens asymptotically, and any IC optimised for speed will not wait for a "full" charge. The design baseline is probably to get the lowest charging required for operation at the highest qualified temperature. Since decreasing temperature will increase charging speed, as you pointed out, you will get to a higher charging ratio, and thus use more power.

    On top of that, the GPU is not exclusively transistors. There is power electronics, there are interconnects, there are caches, and who knows what else (not me). Now when the transistors pull a little more charge due to the higher temperature, and the interconnects which deliver the current have a higher resistance, then you get additional transmission losses. And that's on top of higher leakage rates.

    Of course the equation gets even more fun if you start considering the time constants of the interconnects themselves, which have gotten quite relevant since we got to 32nm structures, hence the high-K materials. Though I honestly have no clue how this contribution is linked to temperature.

    But hey, here's hoping that Ryan will go and investigate the power drop with his equipment and provide us with a full explanation. As I personally don't own a GPU which gets hot at idle (can't force the fan below 30% by software and won't stop it by hand) I cannot test idle power behavior on my own, but I can and did repeat the Furmark test described in the link above, and also see a power saving of about 0.5W per °C with my GTX 660. And that's based on internal power monitoring, so the mainboard/PCIe slot and the PSU should add a bit more to that:
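The disagreement over P=I²R versus P=U²/R in the posts above comes down to what is held constant. A minimal sketch with normalized, illustrative values:

```python
# What's held constant decides which formula applies. With a 5% rise
# in resistance (normalized, illustrative values):
#   constant current: P = I^2 * R -> power rises with R
#   constant voltage: P = U^2 / R -> power falls with R
# A GPU rail is regulated to a voltage, hence the second case.

R0, R1 = 1.0, 1.05  # resistance before/after a 5% increase
I = 1.0             # normalized constant current
U = 1.0             # normalized constant voltage

p_cc_before, p_cc_after = I**2 * R0, I**2 * R1
p_cv_before, p_cv_after = U**2 / R0, U**2 / R1

print("constant current:", p_cc_before, "->", p_cc_after)  # goes up
print("constant voltage:", p_cv_before, "->", p_cv_after)  # goes down
```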
  • Khenglish - Thursday, December 26, 2013 - link

    Yeah I simplified things some. I was just pointing out the main culprits for performance and power scaling with temperature. I think I was over most people's heads anyway so I figured more detail wasn't worth it.

    Also maybe you meant to say something else, but all that caches and registers are is a 4-transistor flip-flop (2 inverters in a loop), with 2 more transistors for reading and writing to that exact cell. Power electronics are just bigger FETs. Saying everything in a processor is just FETs and interconnects is very accurate. There really is nothing else.
  • Arbie - Friday, December 27, 2013 - link

    But they do probably learn that ICs aren't built on silicone. Reply
  • Godigy - Tuesday, December 24, 2013 - link

    Great review, Ryan, but could you please update this review with VRM temps (stock idle/load, overclocked idle/load)? It'll show the temps in GPU-Z as VRM1 (the GPU/VRAM phases) and VRM2 (PLL-the trio of MOSFETS near the video connectors).

  • TechFanatic - Tuesday, December 24, 2013 - link

    This is Sapphire telling all AMD fans out there that your patience shall be rewarded.
    You have to remember that while Sapphire is asking for a $50 premium, custom cards from other AMD partners will bring prices down, namely the DirectCU II card from Asus, which only adds $20 to the MSRP of a reference R9 290.

    This card is faster, quieter, cooler, has more memory and is less expensive than the 780.
    AMD have got a major winner on their hands.
  • GPU_obsessed - Tuesday, December 24, 2013 - link

    Well you won't see MSRP for r9 290 for still some time. Even so, $550 is a price I'm willing to pay over the $500 780 based on performance. But VRM Temps first ofc. Reply
  • Mondozai - Wednesday, December 25, 2013 - link

    The problem with MSRP prices getting distorted due to the mining craze is a mostly NA-centric phenomenon. The mining craze has reached Sweden too, and the forums at even mainstream, non-tech sites are filled with threads on mining, yet we see no distortion in prices here. I think it is less an issue of supply and more about the shameless aspects of American capitalism. Nevertheless, for those of us who do pay MSRP it is a good card. Still waiting for a review of the DirectCU II cooler from Asus on this card. Reply
  • blanarahul - Wednesday, December 25, 2013 - link

    If Computerbase's reviews are to be believed, the Tri-X cards are better than their respective DirectCU2 and Windforce 3X cards. Reply
  • Folterknecht - Wednesday, December 25, 2013 - link

    Computerbase is testing INSIDE A CASE not on an open benchtable. Makes a huge difference. Reply
  • Azurael - Wednesday, December 25, 2013 - link

    Last time I checked, my computer was INSIDE A CASE too. So it seems fairly relevant... Reply
  • bigboxes - Thursday, December 26, 2013 - link

    Thank you. Reply
  • madwolfa - Wednesday, December 25, 2013 - link

    Check hardocp. Reply
  • juhatus - Tuesday, December 24, 2013 - link

    Come on, can you move to the metric system please, or at least give both inches and centimetres.

    <insert stupid joke about imperial system here>
  • ws3 - Tuesday, December 24, 2013 - link

    There is nothing wrong with inches. They are perhaps just unfamiliar to you.
    Just remember this fact: 1 inch = 25.4 mm exactly. For quick calculations in the head 1 inch = 2.5 cm works well. So a 12 inch card ~ 12*2.5 ~ 30cm.
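The mental-math shortcut above, next to the exact definition, as a tiny sketch:

```python
# The in-your-head approximation from the comment above, next to the
# exact definition (1 inch = 25.4 mm exactly, so 2.54 cm).

def inches_to_cm(inches):
    return inches * 2.54   # exact by definition

def quick_estimate_cm(inches):
    return inches * 2.5    # the mental-math shortcut

card_length_in = 12
print(inches_to_cm(card_length_in))       # exact length in cm
print(quick_estimate_cm(card_length_in))  # close enough for a card
```

For a 12-inch card the shortcut is off by under half a centimetre, which is plenty accurate for judging whether a card fits a case.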
  • Senti - Tuesday, December 24, 2013 - link

    There is certainly something very wrong with inches: majority of the world uses metric system and hates your inches with a passion.
    How about measurements in international units and you do calculations in the head?
  • Drumsticks - Tuesday, December 24, 2013 - link

    Well Anandtech is a US based privately owned site, so I imagine they can do what they want.

    Don't get me wrong. I'm from America and I hate imperial haha, but saying they should do it to make it easier for their European audience is a bit silly. Also, although I have NO proof for this, it's possible that while most of the world uses metric, the majority of their audience comes from America.
  • StevoLincolnite - Thursday, December 26, 2013 - link

    It's the internet.
    Nothing is limited to a typical continent's/country's borders and Anandtech serves a world-wide audience, thus by extension it should account for its world-wide audience.
    Majority of the planet uses metric.

    With that in mind, my country goes by the metric system, but inches is still freely used as a form of measurement so it's no big deal.
  • TheJian - Saturday, December 28, 2013 - link

    By that line of thinking he should be putting his site up in German, Chinese, French etc... But no, he doesn't do that because it's an American site. WE SPEAK ENGLISH. If you come here don't expect me to speak Spanish just because we have people migrating from Mexico. If you don't like English don't come here. If you don't like English don't read this site ;) It's not his problem people from outside the country try to read here too. Writing for everyone else would just cost more money and lower the amount or quality of news, as lots of money would be going to translations etc instead of the actual news/reviews. Reply
  • andy o - Monday, December 30, 2013 - link

    Wow, I was with you for a couple of sentences there, but that quickly turned into a xenophobic and borderline racist rant. "If you don't like English don't come here"? How about: if you don't like any of these, don't come here? Reply
  • wetwareinterface - Wednesday, December 25, 2013 - link

    the majority of the world uses the metric system but...
    the majority of the readers of this web site don't.

    the majority of the buyers of this level of video card are also in the U.S.

    if you want reviews using the metric system, go to another web site that uses it, or just pop the number followed by "inches to cm" into google and get your result, instead of commenting that the review is biased against you because of the unit of measurement used in the country the article was authored in.
  • Mondozai - Wednesday, December 25, 2013 - link

    Don't get defensive. The imperial system never made sense, which is why international science is all about the metric system. Nevertheless, I am not personally annoyed at Anandtech, an American site using the measurement system they themselves are accustomed to. But that doesn't mean we shouldn't have thrown the imperial system, an outdated system, on the trash heap, and done so yesterday. Sorry, but it is just a retarded system even if your feelings are hurt. Reply
  • Drumsticks - Wednesday, December 25, 2013 - link

    More or less exactly! Anandtech can do what they want but I do kind of hate our silly system of measurement. It isn't going anywhere sadly Reply
  • mond0 - Wednesday, December 25, 2013 - link

    From his tone, it's clear that he's not being defensive, but merely pointing out the reasons why AnandTech uses imperial measurements to a person who felt like it was Ryan's *duty* to spoon feed him metric measurements so he wouldn't have to do "calculations in the head" while the majority of the viewerbase would. Calling him "defensive" is the closest thing to "you mad bro" you can say. Reply
  • juhatus - Wednesday, December 25, 2013 - link

    Sorry, I was not trying to be a smart ass about this. I bet the split of readers is 50% US and 50% international. I really tried to be as neutral as I can, and no, I don't need help calculating.

    Ryan any comment? Is there any policy about standards?
  • bigboxes - Thursday, December 26, 2013 - link

    Sorry. It's not 50/50. I'm sure that it's >70%, but you've got me curious as to the real numbers. Almost everything on this site is for American buyers. You get the occasional "it's only available in Europe", but usually it's US-centric. So inches it is. You can get an android conversion app if you don't want to google it. Reply
  • bigboxes - Thursday, December 26, 2013 - link

    HA! I was wrong. US only 33.6% of traffic. D'oh! Reply
  • ShieTar - Wednesday, December 25, 2013 - link

    "the majority of the buyers of this level of video card are also in the U.S."

    I'm confused, how would you figure that? The European market for consumer electronics overtook the North American market in volume about a decade ago, and 2-3 years ago the combined volume of both became smaller than the Asian market. Even with a relatively high-end card like this, I would be surprised if more than 25% of sales go to the U.S. these days.

    Also, why would a majority of AT readers be American? Funny measurement units and the price comparisons aside, there is nothing on here that is specifically tailored to the US market. As the majority of tech-savvy Europeans are rather fluent in English, I'm sure AT gets just as many European readers, keeping in mind the much larger overall population.
  • TheJian - Saturday, December 28, 2013 - link

    I didn't realize Europe was a country...Silly me. This is a USA site, get over it. The site is based in NC, USA last I checked and am unaware of it having any server in another country. You're more than welcome to FUND the development of a FOREIGN soil based expansion.

    Does any single country buy more than your assumed 25% that goes to the USA? Germany, Russia, France etc? You're counting Europe as one entity and it's not. There is nothing wrong with catering to his OWN audience. If you want to read it from outside, fine. But don't expect anyone to alter their world to suit yours. It's just not this site's job. There are not many other web sites with multi-language support, and I don't know of many USA sites using multi-language or even metric.

    With US traffic being 33%, there is no other larger country coming to read this site, correct? I translate Chinese/German tech pages all the time. Sure, I wish they'd put them up in English for me, but I realize that it is cost prohibitive to do so. No point in asking as I'd rather have that cost go to reviewing more products etc. That's what translators are for anyway :)
  • ShieTar - Monday, December 30, 2013 - link

    You seem to be a little confused about my statement. Try reading it again, and point out to me where I demanded that AT change anything? I merely pointed out some weak points in the comment of "wetwareinterface". I also did not pretend Europe was a country, merely an economic entity, which it is. I live in Europe, working for a European company, I can order products from any European shop without having to bother about any customs/tariff, I can move to and work in any other European Union country without any paperwork, etc. The different national governments at this point are merely another level of administration, but not something that seriously affects our everyday life.

    More importantly though, none of that matters to my original point, which was just to point out that an estimated 25% are a far way from "the majority", as the other 75% who come from "metric system nations" would be clearly the majority.
  • mpdugas - Thursday, December 26, 2013 - link

    Since number systems are purely imaginary, what difference does how you measure an object make?

    Some folks find fractions easy, some like decimals better, but they are just fractions, too.

    It really just comes down to which you prefer; the object does not change, no matter which fraction system you use.
  • xTRICKYxx - Tuesday, December 24, 2013 - link

    Wow, that cooler is something else. Reply
  • asH98 - Tuesday, December 24, 2013 - link

    I find it amazing how short sighted this writer is. It is obvious to anyone building a system AMD's 290 reference card gives the builder an opportunity to choose her/his GPU options, while allowing 290 suppliers an opportunity to make a few dollars and keep prices low - if the writer cant see something so obvious should I trust his articles? - Reply
  • blanarahul - Tuesday, December 24, 2013 - link

    Please rewrite your thoughts using clearer sentences. Reply
  • blanarahul - Tuesday, December 24, 2013 - link

    And proper punctuation. Reply
  • RaistlinZ - Tuesday, December 24, 2013 - link

    I have absolutely no idea what point you're trying to make. Reply
  • Sunburn74 - Wednesday, December 25, 2013 - link

    I too am completely lost as to your point. Reply
  • asH98 - Thursday, December 26, 2013 - link

    The 290 reference card with fan allows me my choice of cooling options, or perhaps a choice of AIB solutions -this it is still cheaper relative to comparable offerings vs performance - everyone wins !!
    All the bitching and moaning about noise is BS - the reference fan was always to keep prices low and allow options to builders, while AIB suppliers made a few dollars - ...!!
  • firewall597 - Thursday, December 26, 2013 - link

    I like how you automatically assume that everyone should be builders willing to take their GPU apart for aftermarket solutions. Reply
  • asH98 - Thursday, December 26, 2013 - link

    - if you're not a builder prices are still low via AIB's, and their solutions !! Reply
  • sparkuss - Tuesday, December 24, 2013 - link

    Just a quick double check. This is a pure reference board with no new components or change in location?

    I want to double confirm that I could buy this GPU now and be ready for some new Water Blocks for reference R290 when they become available (the WBs)?

  • Ryan Smith - Tuesday, December 24, 2013 - link

    Correct. Pure reference board with identical components and component positions, right down to the AMD logo.
  • Godigy - Wednesday, December 25, 2013 - link

    No, it follows the reference design, but Sapphire decided to put different chokes on. You'll see that Sapphire put on CEC R15 chokes, while the reference card uses different chokes from the same brand. Reply
  • sparkuss - Wednesday, December 25, 2013 - link

    Looking at the picture they appear to be the same size though, don't they? Only a change in size or orientation should cause a problem with water blocks, I hope. Reply
  • ggathagan - Monday, December 30, 2013 - link

    What's the point of buying a pricier GPU with a custom cooler if you intend on replacing the cooler with water blocks? Reply
  • sparkuss - Monday, December 30, 2013 - link

    Basically because every time I think I'm ready to build my watercool rig, something happens and I don't follow through. But I still want to upgrade the GPU and I end up waiting too long before the reference cards are all gone. With this remaining reference I get all the benefit of the better cooling and can still hang on to it until I WC. Reply
  • shotgunx1x - Tuesday, December 24, 2013 - link

    I know it should be soon, but any idea when we can expect these to be available? Reply
  • Shreddie - Tuesday, December 24, 2013 - link

    Most likely Mid-Late January (that's the estimated DCII release date) Reply
  • Ryan Smith - Tuesday, December 24, 2013 - link

    Sapphire says it should start hitting shelves at the end of this week. Reply
  • Mopar63 - Wednesday, December 25, 2013 - link

    They seem to already be out with some etailers in Europe. Reply
  • Will Robinson - Wednesday, December 25, 2013 - link

    Guess Blackened23 and Balla are just gonna ignore the beating the 780 gets in this review lolol
    What was that about 780 overclocking again? :)
  • Mondozai - Wednesday, December 25, 2013 - link

    The OC performance here is better than the review in Hexus.

    Also, you can get from the mid-800s to the mid-1200s in MHz on a custom-cooled GTX 780 with OC. The OC here is a lesser amount. I wonder what the performance on a maxed-out OC'd GTX 780 would be vs the OC that Ryan applied.
  • Mondozai - Wednesday, December 25, 2013 - link

    ("That Ryan applied on this Sapphire 290", it should say at the end.) Reply
  • blanarahul - Wednesday, December 25, 2013 - link

    Computerbase are liars. Look at their Noise charts. Tri-X 290 is louder than a 780 Ti. Look at Anandtech or, 290 Tri-X is quieter than 780 Ti. Reply
  • Mopar63 - Wednesday, December 25, 2013 - link

    Calling them liars might be harsh. There is no standardization in the review industry when it comes to how noise level is measured. Some use an open bench and others use cases. The cases can vary with each model, and then there is the fact that the hardware and software used to measure can vary as well. Add to this the measuring methodology, such as distance from the item as well as how it was set to run: did they measure 100% speed, game-play speeds, auto or set, and so on. Reply
  • Bobs_Your_Uncle - Sunday, December 29, 2013 - link

    Then there's always the question: are they recording the dB(A) measurements on a Metric scale or on an Imperial scale? : / Reply
  • bigboxes - Sunday, December 29, 2013 - link

    LOL Reply
  • sheh - Wednesday, December 25, 2013 - link

    I still don't understand why reference designs use blowers. Isn't that mainly useful for small cramped cases? Don't most users have a well-spaced case, and especially the users who go for high-end cards, and so are better served with an open air cooler? Reply
  • blanarahul - Wednesday, December 25, 2013 - link

    These open air coolers dump hot air "inside" the case/cabinet. This has a negative impact on CPU, RAM, chipset, HDD etc. temperatures (unless you are using a super tower). Blowers throw the hot air out of the case and hence have no impact on other components.

    Open air coolers make a lot of sense for Open Air test benches though.
  • Mopar63 - Wednesday, December 25, 2013 - link

    The impact of an open air cooler on a system's internals is a lot less than many think. The heat released is not a direct increase but rather like adding hot water to cooler water: the water is not suddenly hot, just a bit warmer. Also, even mid towers and most mATX towers are moving more than enough air to make this a minor effect at best. In fact MANY of the mITX cases we are seeing for gaming rigs can handle it easily as well. Reply
  • Godigy - Wednesday, December 25, 2013 - link

    Not necessarily. I had an R9 290 with the Gelid Icy aftermarket cooler, and if I didn't take the side panel off of my case, it would really make stuff in the case hot. My CPU temps jumped 10C with the side panel closed. If I were to open the side panel, I could feel the heat coming out of my case, which has very good airflow. Reply
  • Mayuyu - Wednesday, December 25, 2013 - link

    I have one of these open air GPUs and I really don't like the design. The GPU board is so tall that it essentially creates another compartment in the case. Since the fan faces downward, all the heat is in the bottom of the case. The big fan at the back of the case only moves air for the CPU and RAM. Reply
  • skiboysteve - Thursday, December 26, 2013 - link

    I had that even worse. Card would crash in Diablo 3 from overheating. Case was super hot and GPU fan was pegged on high. I sold it (6850) and bought a GTX 660 because it had a blower, now its quiet and cool. Reply
  • Th-z - Wednesday, December 25, 2013 - link

    Therein lies the problem: people like small form factors these days. An open air cooler can complicate how the air is moved. Take the beta Steam Machine for example: the GPU is in a chamber that uses a riser to reduce the size of the case. The design is simple for a blower cooler; the air basically moves one way all the way through. If it's an open air cooler, hot and cool air intermingle, additional fans would be needed to help move the air, and the air path would also need a redesign.

    I agree with Ryan, AMD simply needs to do better job with their reference cooler, perhaps their AIB partners can step in and make their own blower cards that perform better than AMD's reference design.
  • Ryan Smith - Thursday, December 26, 2013 - link

    I had originally intended to put the following discussion about blowers versus open air coolers in the article, but it came off as too disjointed from the rest of the article, so I dropped it. But since you’re asking, I’ll publish it here in the comments.


    When looking at the cooling performance of the 290 Tri-X OC relative to the reference 290, it’s important to keep in mind that the Tri-X OC’s cooling advantages don’t come for free; there are tradeoffs to be made for achieving this kind of performance. At the risk of sounding like a broken record, open air coolers can be very high performance solutions, however there are some important differences between open air coolers and blowers that need to be taken into consideration.

    Between the two types of coolers, blowers are the more compatible and more self-sufficient due to the fact that the blower design is essentially self-exhausting. By blowing hot air directly outside of the chassis, blowers aren’t significantly reliant on the chassis cooling, meaning they’ll work in a wide variety of cases and environments, especially small form factor designs or multi-GPU setups. The one downside to blowers is that the limited amount of space available to funnel air (about 1 PCIe slot’s width) requires that all of that airflow is generated by one fan, which in turn may have to run at a relatively high speed to move enough air. The end result being that while blowers don’t have to be loud, they’re generally louder than open air coolers.

    Open air coolers on the other hand essentially punt on the issue of exhausting heat, focusing solely on removing heat from the GPU and related components, and making removal of that hot air the job of the chassis. This allows open air coolers to utilize numerous large, slow fans that can move a good deal of air without generating a lot of noise, but only a small portion of that air is exhausted outside the chassis by the open air cooler itself. The bulk of the work for removing heat from the chassis falls to the chassis itself, which can be beneficial as chassis fans are larger and quieter still, potentially making the combined solution a very quiet one.

    When it comes to open air coolers the drawback is two-fold. The first is simply that open air cards need breathing room; even though most cards are only two slots wide, the slot adjacent to the card needs to be kept completely open in order to permit airflow (even a small card like a sound card would still be an issue). The second drawback is that if the chassis can’t handle the heat load – and keep in mind that a single 290 under load is going to generate more heat than the rest of the system combined – then open air coolers will struggle to work well, while at the same time the heat from the video card will have a knock-on effect that makes it hard to properly cool the other components in the chassis.

    Because of their compatibility and self-reliance, blowers are the coolers used on most high-end reference cards, as the design allows the reference card to be used in the widest range of systems. In that sense blowers represent a nice middle ground between functionality and noise, with a high quality blower capable of delivering all of that functionality without bringing too much noise. NVIDIA’s GTX Titan blower is a good example of what a blower is capable of at the high end, while the reference 290 is an unfortunate example of what a blower looks like when it’s struggling to keep up. On the other hand an open air cooler can scale up better while still maintaining very low noise levels – as exemplified by cards like the 290 Tri-X OC and the Radeon HD 7990 – but the compatibility issues mean that the resulting cards can’t be used in as wide a range of systems, something that can be problematic for reference cards.

    In the end, however, there’s a need for both kinds of coolers to be on the market. As neither style is without its flaws, having two vastly different designs allows for a wider range of market coverage than either design alone could accomplish.
  • Jwboo65 - Monday, December 30, 2013 - link

    Typo in the third paragraph. The very last word. Nice article. Thanks Ryan! Reply
  • Wade_Jensen - Wednesday, December 25, 2013 - link

    Would someone be willing to explain binning? I hear Anand and Ryan and Ian talking about it surrounding CPU and GPU scaling, but it's never explained. How is it accomplished and/or caused in manufacturing? Is that how Intel has differing performance in an i5 and i7 of identical TDP, HT aside?

    Yes, I've tried Google. :p
  • sheh - Wednesday, December 25, 2013 - link

    Take a million chips, run them through test equipment to collect data on voltage, heat, max frequency, and functionality of cores/subunits. Sort them according to the results, do any external tweaking if needed (e.g., I think old CPUs had things like resistors on the package to disable/limit features), print the correct model number/put on the sticker, sell. Reply
  • Sunburn74 - Wednesday, December 25, 2013 - link

    Essentially, if you manufacture 100 chips with a target performance level X, some will exceed that target, some will just barely reach it, and some will underperform and not quite reach it.

    As the supplier, you bin/plan to sell the overperformers as your highest-end product (e.g. GTX 9000 OC uber ultra TI lightning thor odin card), your on-target performers as some mid-range product (GTX 8950), and your underperformers as some lower-range product (GTX 7000 energy sipper).

    It's more complex than that in the real world, as Intel/NVIDIA/AMD offer a large multitude of products, but in a nutshell that's binning.
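    As a toy illustration of the sorting step described above (the tiers, clock thresholds, and test results here are entirely hypothetical; real binning considers many more parameters, such as leakage and voltage/frequency curves):

    ```python
    # Toy sketch of chip binning: sort tested chips into product tiers.
    # Tier names and MHz thresholds are made up for illustration only.

    def bin_chip(max_stable_mhz, all_units_ok):
        """Assign a tested chip to a (hypothetical) product tier."""
        if not all_units_ok:
            # Defective units get disabled and the chip is sold as a cut-down part.
            return "cut-down salvage part"
        if max_stable_mhz >= 1100:
            return "flagship / factory-OC part"
        if max_stable_mhz >= 1000:
            return "standard part"
        return "low-clocked budget part"

    # Pretend test-bench results: (max stable clock in MHz, all units functional?)
    chips = [(1150, True), (1020, True), (980, True), (1200, False)]
    bins = [bin_chip(mhz, ok) for mhz, ok in chips]
    ```

    The key point is that all four chips came off the same production line; the model number is decided after testing, not before.
    
    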
  • Sunburn74 - Wednesday, December 25, 2013 - link

    BTW, this also explains why the best overclocking products generally tend to be on the higher end of the product spectrum Reply
  • Wade_Jensen - Wednesday, December 25, 2013 - link

    Thanks guys! :) Reply
  • gonks - Wednesday, December 25, 2013 - link

    Ryan, there's a typo in the gaming charts and OC charts; the headers say "Maxium quality".
    Great review btw!
  • Ryan Smith - Thursday, December 26, 2013 - link

    Huh. I had thought I had fixed that. Thanks! Reply
  • Duelix - Thursday, December 26, 2013 - link

    Can't wait for a DirectCU II model from Asus. An R9 290 with a decent cooler is stupid fast for the money. Take that, Nvidia! Reply
  • hoboville - Thursday, December 26, 2013 - link

    It's really amazing how in some games the 780 Ti is 10-20% faster, and then slower in others. If that performance were consistent across all game titles, the 780 Ti might just be worth the price.

    Then again, a $700 card vs. a $450 card means it costs more than 50% more! What can you say then other than: if you have the budget for a 780 Ti, save another $200 and get a 2nd 290.
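    A quick sanity check on the arithmetic above, assuming the $700 and $450 prices quoted in the comment:

    ```python
    # Prices quoted in the comment above (MSRP-era figures, not street prices).
    gtx_780_ti = 700
    r9_290 = 450

    # Fractional price premium of the 780 Ti over a single 290.
    premium = gtx_780_ti / r9_290 - 1            # about 0.556, i.e. more than 50% more

    # Extra cash needed to buy two 290s instead of one 780 Ti.
    extra_for_two_290s = 2 * r9_290 - gtx_780_ti # $200
    ```
    
    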
  • Brent20 - Tuesday, December 31, 2013 - link

    The problem with that thinking is the AMD cards do NOT sell for $450, ANYWHERE. They are selling for 50% OVER retail price. Reply
  • r13j13r13 - Thursday, December 26, 2013 - link

    I'm in ecstasy, but with the 290X version it would equal the 780 Ti. Reply
  • randinspace - Friday, December 27, 2013 - link

    "Sapphire is essentially charging $50 for a better cooler..."

    It's worth noting that, since installing an aftermarket cooling solution on a reference card voids the warranty, the $50 premium Sapphire is charging is probably worth it in the long run.
  • toyotabedzrock - Friday, December 27, 2013 - link

    The almost perfect scaling of performance with the overclock is interesting. If AMD would allow them to do the same with the 290X, it would allow them to outright beat the 780 Ti. Reply
  • ggathagan - Monday, December 30, 2013 - link

    An important note for owners of cases that rotate the motherboard 90 degrees (Silverstone Fortress 2, etc...):
    The Asus DirectCU, the HIS IceQ X² and the MSI FrozrII lines orient the heatsink fins in a way that can take advantage of the bottom-to-top airflow of those cases.
    Custom cooled solutions from Sapphire, EVGA, XFX, Gigabyte and MSI all orient the cooling fins of the heatsink perpendicular to the airflow in such cases.
  • Mopar63 - Tuesday, December 31, 2013 - link

    This sounds like it might matter, but it does not. The fins are oriented so they sit essentially across the airflow of any case design. That airflow, however, is not enough to be an issue against the direct air pressure of the GPU fans, and the exhaust is then whisked away by the airflow currents of the case.

    If there was ANY difference in temps it might be 1 or 2 C at the MOST.
  • Brent20 - Tuesday, December 31, 2013 - link

    If you want the review to be fair, I think it's time that all these AMD reviews reflect the REAL inflated price that the card sells for when comparing it to the competition, which is about 50% over retail value.

    AMD will die off as a gaming card company as long as they continue to cater to the "mining" users. Mark my words. Why buy a card for 50% over retail, and a card that is rarely in stock anywhere?

    They need to put something in to block mining use. They may make big bucks off it now, but in the long run it's going to hurt their "gaming" business badly.

    Perhaps they could block mining in the gaming card and make a separate card for mining and sell it at quadruple the price, as the miners appear willing to pay any price for it.
  • teiva - Thursday, January 02, 2014 - link

    The prices in Australia haven't fluctuated; they've stayed pretty well the same since their introduction. Reply
  • Mopar63 - Thursday, January 02, 2014 - link

    Brent20, this shows a lack of understanding of how things work. AMD is not catering to miners. The particular way the mining software works just happens to run better on AMD's GPU design than on NVidia's. This was not done with miners in mind; it was the design they chose BEFORE anyone knew miners existed.

    If they put anything in to "block" miners, that would just be wrong. AMD sells GPUs; they do not care who buys them, only that someone buys them, and NVidia works the same way. Blocking a potential user is bad business, and in this case may hurt the GPU in other areas of performance.
  • FookDuSushi - Monday, January 06, 2014 - link

    If they already have one ready, why don't they release it to the public yet? Reply
  • capawesome9870 - Saturday, January 11, 2014 - link

    Do you have any 4K tests? Similar to the low/medium settings in the original R9 290 and 290X reviews. Reply
  • boozzer - Wednesday, January 22, 2014 - link

    will there be a 290X version? this seems to be the cooler to get. please, please do a 290X Tri-X OC review + OC review!!!!!!!!!!! please. Reply
  • Muckster - Friday, February 28, 2014 - link

    How would this card compare to the MSI GTX 780 Ti in terms of noise and temp? TechPowerUp has an article on the GTX but I can't seem to find an apples to apples comparison. As with the Sapphire vs. the stock R9 290, the MSI improved significantly on the noise level of the stock GTX 780 Ti.
  • kiddo - Monday, March 03, 2014 - link

    is this good ? cause my bro bought it for me, I`m 5yr :3 Reply
  • P39Airacobra - Tuesday, January 13, 2015 - link

    Over 500 watts? That must be an error! The damn thing has a 275 W TDP, so at the most it should only go a little past 400 watts, but that's it! Reply
  • driessen9 - Friday, March 06, 2015 - link

    what program did you use for the overclocking? Reply
