Meet The Radeon R9 290X

Now that we’ve had a chance to discuss the features and the architecture of GCN 1.1 and Hawaii, we can finally get to the hardware itself: AMD’s reference Radeon R9 290X.

Other than the underlying GPU and the livery, the reference 290X is not a significant deviation from the reference design for the 7970. There are some changes that we'll go over, but for better and for worse AMD's reference design is not much different from the $550 card we saw almost 2 years ago. For cooling in particular this means AMD is delivering a workable cooler, but not one that's going to compete with the efficient-yet-extravagant coolers found on NVIDIA's GTX 700 series.

Starting as always from the top, the 290X measures in at 10.95”. The PCB itself is a bit shorter at 10.5”, but like the 7970 the metal frame/baseplate that is affixed to the board adds a bit of length to the complete card. Meanwhile AMD’s shroud sports a new design, one which is shared across the 200 series. Functionally it’s identical to the 7970, being made of similar material and ventilating in the same manner.

Flipping over to the back of the card quickly, you won’t find much here. AMD has placed all 16 RAM modules on the front of the PCB, so the back of the PCB is composed of resistors, pins, mounting brackets, and little else. AMD continues to go without a backplate here as the backplate is physically unnecessary and takes up valuable breathing room in Crossfire configurations.

Pulling off the top of the shroud, we can see in full detail AMD's cooling assembly, including the heatsink, radial fan, and the metal baseplate. Other than angling the far side of the heatsink, this heatsink is essentially unchanged from the one on the 7970. AMD is still using a covered aluminum block heatsink designed specifically for use in blower designs, which runs most of the length of the card between the fan and the PCIe bracket. Connecting the heatsink to the GPU is an equally large vapor chamber, which is in turn mounted to the GPU using AMD's screen printed, high performance phase change TIM. Meanwhile the radial fan providing airflow is the same 75mm diameter fan we first saw on the 7970. Consequently the total cooling capacity of this assembly will be similar, but not identical, to the 7970's; with AMD running the 290X at a hotter 95C versus the 80C average of the 7970, the same cooler is able to move more heat despite being otherwise no more advanced.
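
To put a rough number on that last point: under a simple first-order convection model (our own back-of-the-envelope estimate, not an AMD figure, and assuming roughly 40C air moving through the heatsink), the heat a fixed heatsink/fan combination can move scales with the GPU-to-air temperature delta:

```latex
% First-order convection: heat moved scales with the GPU-to-air temperature delta.
% hA (the heatsink's effective conductance) is the same hardware in both cases;
% T_air ~ 40C is an assumed intake temperature, not a measured value.
Q = hA\,(T_{\mathrm{GPU}} - T_{\mathrm{air}})
\qquad\Rightarrow\qquad
\frac{Q_{290X}}{Q_{7970}} \approx \frac{95 - 40}{80 - 40} \approx 1.4
```

This ignores fan speed differences and the vapor chamber's own limits, but it captures why letting the GPU sit at 95C buys the otherwise unchanged cooler meaningfully more heat-moving capacity.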


Top: 290X. Bottom: 7970

Moving on, though we aren't able to take apart the card for pictures (we need it intact for future articles), we wanted to quickly go over the power and RAM specs for the 290X. For power delivery AMD is using a traditional 5+1 power phase setup, driven by their newly acquired IR 3567B controller. This will be plenty to drive the card at stock, but hardcore overclockers looking to put the card under water or other exotic cooling will likely want to wait for something with a more robust power delivery system. Meanwhile, despite the 5GHz memory clockspeed of the 290X, AMD has actually equipped the card with everyone's favorite 6GHz Hynix R0C modules, so memory controller willing there should be quite a bit of memory overclocking headroom to play with. All 16 of these modules are located around the GPU on the front side of the PCB, with thermal pads connecting them to the metal baseplate for cooling.
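
For a sense of what that headroom could be worth, here's a quick back-of-the-envelope bandwidth calculation (the 512-bit bus width is the 290X's published spec; the function and figures are our own illustration, not anything from AMD's materials):

```python
# Peak theoretical GDDR5 bandwidth: effective data rate (GHz) x bus width (bits) / 8 bits per byte
def gddr5_bandwidth_gb_s(effective_clock_ghz, bus_width_bits=512):
    """Returns peak theoretical bandwidth in GB/s for a GDDR5 bus."""
    return effective_clock_ghz * bus_width_bits / 8

print(gddr5_bandwidth_gb_s(5.0))  # 320.0 GB/s at the stock 5GHz data rate
print(gddr5_bandwidth_gb_s(6.0))  # 384.0 GB/s if the 6GHz-rated modules can be pushed that far
```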

Perhaps the biggest change for the 290X as opposed to the 7970 is AMD's choice in balancing display connectivity against ventilation. With the 6970 AMD used a half-slot vent to fit a full range of DVI, HDMI, and DisplayPort outputs, only to drop the second DVI port on the 7970 in order to utilize a full-slot vent. With the 290X AMD has gone back once more to a stacked DVI configuration, which means the vent is once again down to a bit over half a slot in size. At this point both AMD and NVIDIA have successfully shipped half-slot vent cards at very high TDPs, so we're not the least bit surprised that AMD has picked display connectivity over ventilation, as a half-slot vent is proving to be plenty capable in these blower designs. Furthermore, based on NVIDIA's and AMD's latest designs we wouldn't expect to see full-size vents return for these single-GPU blowers in the future, at least not until someone finally gets rid of space-hogging DVI ports entirely.

Top: R9 290X. Bottom: 7970

With that in mind, the display connectivity for the 290X utilizes AMD's new reference configuration of 2x DL-DVI-D, 1x HDMI, and 1x DisplayPort. Compared to the 7970, AMD has dropped the two Mini DisplayPorts for a single full-size DisplayPort and brought back the second DVI port. Note that unlike some of AMD's more recent cards these are both physically and electrically DL-DVI ports, so the card can drive 2 DL-DVI monitors out of the box; the second DVI port isn't just for show. The 7970's single DVI port, coupled with the high cost of DisplayPort to DL-DVI adapters, made that configuration an unpopular choice in some corners of the world, so this change should make DVI users happy, particularly those splurging on the popular and cheap 2560x1440 Korean IPS monitors (the cheapest of which lack anything but DVI).

But as a compromise of this design – specifically, making the second DVI port full DL-DVI – AMD had to give up the second DisplayPort. This does mean that compared to the 7970 the 290X has lost some degree of display flexibility, however, as DisplayPorts allow for both multi-monitor setups via MST and for easy conversion to other port types via DVI/HDMI/VGA adapters. With this configuration it's not possible to drive 6 fully independent monitors on the 290X; the DisplayPort will get you 3, and the DVI/HDMI ports the other 3, but due to the clock generator limits on the 200 series the 3 monitors on the DVI/HDMI ports must be timing-identical, precluding them from being fully independent. On the other hand this means that the PC graphics card industry has effectively settled the matter of DisplayPort versus Mini DisplayPort, with full-size DisplayPort winning as the port style of choice for both AMD and NVIDIA. It's not how we wanted this to end up – we still prefer Mini DisplayPort as it's equally capable but smaller – but at least we'll now have consistency between AMD and NVIDIA.

Moving on, AMD's dual BIOS functionality is back once again for the 290X, and this time it has a very explicit purpose. The 290X will ship with two BIOSes, a "quiet" BIOS and an "uber" BIOS, selectable with the card's BIOS switch. The difference between the two is that the quiet BIOS ships with a maximum fan speed of 40%, while the uber BIOS ships with a maximum fan speed of 55%. The quiet BIOS is the default BIOS for the 290X, and based on our testing will hold the noise levels of the card equal to or less than those of the reference 7970.

AMD Radeon Family Cooler Comparison: Noise & Power

Card                    Load Noise (Gaming)    Estimated TDP
Radeon HD 7970          53.5dB                 250W
Radeon R9 290X Quiet    53.3dB                 300W
Radeon R9 290X Uber     58.9dB                 300W

However, because of the high power consumption and heat generation of the underlying Hawaii GPU, in quiet mode the card is unable to sustain its full 1000MHz boost clock for more than a few minutes; there simply isn't enough airflow occurring at 40% to move 300W of heat. We'll look at power, temperature, and noise in full a bit later in our benchmark section, but average sustained clockspeeds are closer to 900MHz in quiet mode. Uber mode and its 55% maximum fan speed, on the other hand, can move just enough air to keep the card at 1000MHz in all non-TDP limited workloads. The tradeoff is that the last 100MHz of clockspeed is going to be incredibly costly from a noise perspective, as we'll see. The reference 290X would not have been a viable product if it didn't ship with quiet mode as the default BIOS.
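
To make the fan cap/clockspeed interaction concrete, below is a deliberately simplified sketch of a temperature-target control loop. This is our own toy model, not AMD's actual PowerTune implementation; the function name, step size, and structure are all illustrative assumptions. The point it demonstrates: once the fan is pinned at its BIOS-imposed cap and the GPU is sitting at its 95C target, clockspeed is the only lever left.

```python
# Toy temperature-target loop (illustrative only; not AMD's actual algorithm).
def next_clock_mhz(temp_c, fan_pct, fan_cap_pct, clock_mhz,
                   target_temp_c=95.0, boost_mhz=1000, step_mhz=5):
    if temp_c >= target_temp_c and fan_pct >= fan_cap_pct:
        # Fan pinned at its cap (40% quiet / 55% uber): shed heat by dropping clocks.
        return max(clock_mhz - step_mhz, 0)
    if temp_c < target_temp_c and clock_mhz < boost_mhz:
        # Thermal headroom available: climb back toward the 1000MHz boost clock.
        return min(clock_mhz + step_mhz, boost_mhz)
    return clock_mhz
```

In a model like this, a 40% cap leaves the loop settling somewhere below the boost clock (consistent with the ~900MHz averages we measured), while a 55% cap provides enough additional airflow to hold 1000MHz, at the noise cost shown above.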

Finally, let's wrap things up by talking about the miscellaneous power and data connectors. With AMD having gone with bridgeless (XDMA) Crossfire for the 290X, the Crossfire connectors that have adorned high-end AMD cards for years are now gone. Other than the BIOS switch, the only things you will find at the top of the card are the traditional PCIe power sockets. AMD is using the traditional 6pin + 8pin setup here, which combined with the PCIe slot power is good for delivering 300W to the card, matching what we estimate to be the card's TDP limit. Consequently overclocking boards are all but sure to go the 8pin + 8pin route once those eventually arrive.
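
For reference, that 300W figure falls straight out of the PCIe power delivery limits; a trivial sanity check (the variable names are ours):

```python
# PCIe specification power limits per source
PCIE_SLOT_W = 75    # x16 slot
SIX_PIN_W   = 75    # 6-pin PEG connector
EIGHT_PIN_W = 150   # 8-pin PEG connector

print(PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W)  # 300W: the reference 290X's power budget
print(PCIE_SLOT_W + 2 * EIGHT_PIN_W)          # 375W: an 8pin + 8pin overclocking board
```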


396 Comments


  • Sandcat - Friday, October 25, 2013 - link

    That depends on what you define as 'acceptable frame rates'. Yeah, you do need a $500 card if you have a high refresh rate monitor and use it for 3D games, or just improved smoothness in non-3D games. A single 780 with my brothers' 144Hz Asus monitor is required to get ~90 fps (i7-930 @ 4.0) in BF3 on Ultra with MSAA.

    The 290x almost requires liquid...the noise is offensive. Kudos to those with the equipment, but really, AMD cheaped out on the cooler in order to hit the price point. Good move, imho, but too loud for me.
  • hoboville - Thursday, October 24, 2013 - link

    Yup, and it's hot. It will be worth buying once the manufacturers can add their own coolers and heat pipes.

    AMD has always been slower at lower res, but better in the 3x1080p to 6x1080p arena. They have always aimed for high-bandwidth memory, which always performs better at high res. This is good for you as a buyer because it means you'll get better scaling at high res. It's essentially forward-looking tech, which is good for those who will be upgrading monitors in the next few years when 1440p IPS starts to be more affordable. At low res the bottleneck isn't RAM, but compute power. Regardless, buying a Titan / 780 / 290X for anything less than 1440p is silly, you'll be way past the 60-70 fps human eye limit anyway.
  • eddieveenstra - Sunday, October 27, 2013 - link

    Maybe 60-70fps is the limit, but at 120Hz 60FPS will give noticeable lag. 75 is about the minimum. That or I have eagle eyes. The 780gtx still dips into low framerates at 120Hz (1920x1080). So the whole debate about the Titan or 780 being overkill @1080p is just nonsense. (780gtx 120Hz gamer here)
  • hoboville - Sunday, October 27, 2013 - link

    That really depends a lot on your monitor. When they talked about Gsync and frame lag and smoothness, they mentioned that when FPS doesn't exactly match the refresh rate you get latency and bad frame timing. That you have this problem with a 120 Hz monitor is no surprise, as at anything less than 120 FPS you'll see some form of stuttering. When we talk about FPS > refresh rate then you won't notice this. At home I use a 2048x1152 @ 60 Hz and beyond 60 FPS all the extra frames are dropped, whereas in your case you'll have some frames "hang" when you are getting less than 120 FPS, because the frames have to "sit" on the screen for an interval until the next one is displayed. This appears to be stuttering, and you need to get a higher FPS from the game in order for the frame delivery to appear smoother. This is because apparent delay decreases as a ratio of [delivered frames (FPS) / monitor refresh speed]. Once the ratio is small enough, you can no longer detect apparent delay. In essence 120 Hz was a bad idea, unless you get Gsync (which means a new monitor).

    Get a good 1440p IPS at 60 Hz and you won't have that problem, and the image fidelity will make you wonder why you ever bought a monitor with 56% of 1440p pixels in the first place...
  • eddieveenstra - Sunday, October 27, 2013 - link

    To be honest, I would never think about going back to 60Hz. I love 120Hz but don't know a thing about IPS monitors. Thanks for the response....

    Just checked it and that sounds good. When it becomes more affordable I will start thinking about that. Seems like IPS monitors are better with colors and have less blur @60Hz than TN. Link: http://en.wikipedia.org/wiki/IPS_panel
  • Spunjji - Friday, October 25, 2013 - link

    Step 1) Take data irrespective of different collection methods.

    Step 2) Perform average of data.

    Step 3) Completely useless results!

    Congratulations, sir; you have broken Science.
  • nutingut - Saturday, October 26, 2013 - link

    But who cares if you can play at 90 vs 100 fps?
  • MousE007 - Thursday, October 24, 2013 - link

    Very true, but remember, the only reason nvidia prices their cards where they are is because they could. (Eg Intel CPUs v AMD) Having said that, I truly welcome the competition as it makes it better for all of us, regardless of which side of the fence you sit.
  • valkyrie743 - Thursday, October 24, 2013 - link

    The card runs at 95C and sucks power like no tomorrow. It only beats the 780 by a very little, and does not overclock well.

    http://www.youtube.com/watch?v=-lZ3Z6Niir4
    and
    http://www.youtube.com/watch?v=3OHKWMgBhvA

    http://www.overclock3d.net/reviews/gpu_displays/am...

    I like his review. It's purely honest and shows the facts. I'm not an NVIDIA fanboy nor am I an AMD fanboy, but I'll take NVIDIA right now over AMD.

    I do like how this card is priced and the performance for the price. It makes the Titan not worth 1000 bucks (or the 850 bucks it goes for used on forums), but as for the 780: if you get a non-reference 780, it will be faster than the 290X and put out LESS heat and LESS noise, as well as use less power.

    Plus the GTX 780 Ti is coming out in mid November, which will probably cut the cost of the current 780 to $550, and the Ti would probably be around $600 and beat this card even more.
  • jljaynes - Friday, October 25, 2013 - link

    You say the review sticks with the facts - he starts off talking about how ugly the card is, so it needs to beat a Titan, and then in the next sentence he says the R9 290X will cost $699.

    He sure seems to stick with the facts.
