Final Words

Bringing this review to a close: going into this launch AMD was especially excited about the 290X, and it’s easy to see why. Traditionally AMD has not been able to compete with NVIDIA’s big flagship GPUs, and while that hasn’t stopped AMD from carving out a comfortable spot for themselves, it does mean that NVIDIA gets left to their own devices. As such, while the sub-$500 market has been heavily competitive this entire generation, the same could not be said of the market over $500 until now. And although a niche of a niche in terms of volume, this market segment is where the most powerful video cards reside, so fierce competition here not only brings down the price of these flagship cards sooner, but in the process it inevitably pushes prices down across the board. So seeing AMD performance-competitive with GTX Titan and GTX 780 with a single-GPU card of their own is absolutely a breath of fresh air.

Getting down to business then, AMD has clearly positioned the 290X as a price/performance monster, and while that’s not the be-all and end-all of evaluating video cards, it’s certainly going to be the biggest factor for most buyers. To that end, at 2560x1440 – what I expect will be the most common resolution used with such a card for the time being – AMD is essentially tied with GTX Titan, delivering an average of 99% of the performance of NVIDIA’s prosumer-level flagship. Against NVIDIA’s cheaper and more gaming-oriented GTX 780 that becomes an outright lead, with the 290X ahead by an average of 9% and never falling behind the GTX 780.

Consequently, against NVIDIA’s pricing structure the 290X is by every definition a steal at $549. Even if it were merely equal to the GTX 780 it would still be $100 cheaper, but instead it’s both faster and cheaper, something that has proven time and again to be a winning combination in this industry. Meanwhile the fact that it can even tie GTX Titan is mostly icing on the cake – for traditional gamers Titan hasn’t made a lot of sense since the GTX 780 came out – but it’s nevertheless an important milestone for AMD, since it’s a degree of parity they haven’t achieved in years.

But with that said, although the 290X has a clear grip on performance and price, it comes at the cost of power and cooling. With GTX Titan and GTX 780 NVIDIA set the bar for power efficiency and cooling performance on a high-end card, and while that’s not necessarily out of AMD’s reach, it’s the kind of thing that’s only sustainable with high video card prices, which is not where AMD has decided to take the 290X. By focusing on high performance AMD has had to push quite a bit of power through the 290X, and by focusing on price they had to do so without blowing their budget on cooling. The end result is that the 290X is more power hungry than any comparable high-end card, and while AMD is able to effectively dissipate that much heat, the resulting cooling performance (as measured by noise) is at best mediocre. It’s not so loud as to be intolerable for a single-GPU setup, but it’s about as loud as can still be called reasonable, never mind preferable.

On that note, while this specific cooler implementation leaves room for improvement, the underlying technology has turned out rather well thanks to AMD’s PowerTune improvements. Now that AMD has fine-grained control over GPU clockspeeds and voltages, along with the hardware needed to monitor and control the full spectrum of power/temperature/noise, the door is open to more meaningful ways of adjusting the card and monitoring its status. Admittedly a lot of this is a retread of ground NVIDIA already covered with GPU Boost 2.0, but AMD’s fan speed throttling in particular is a more intuitive method of controlling GPU noise than trying to operate by proxy via temperature and/or power.
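To make the fan-first control idea concrete, here is a minimal sketch of such a throttle loop. To be clear, this is our own illustrative logic, not AMD’s actual firmware: the function name and every constant (the 95C target, the fan cap, the 727MHz floor) are assumptions loosely inspired by the card’s published behavior.

```python
# Hypothetical sketch of fan-capped throttling: the user sets a maximum fan
# speed (i.e. a noise ceiling), and clockspeed is only sacrificed once that
# ceiling is reached. All names and constants here are illustrative, not AMD's.

def throttle_step(temp_c, fan_pct, clock_mhz,
                  temp_target=95.0, fan_cap=55.0,
                  fan_step=5.0, clock_step=13.0, clock_min=727.0):
    """One control iteration: spend the noise budget (fan speed) first;
    only drop clocks once the fan is already at the user's cap."""
    if temp_c > temp_target:
        if fan_pct < fan_cap:
            fan_pct = min(fan_cap, fan_pct + fan_step)          # ramp fan first
        else:
            clock_mhz = max(clock_min, clock_mhz - clock_step)  # then trade clocks
    elif temp_c < temp_target - 5.0 and clock_mhz < 1000.0:
        clock_mhz = min(1000.0, clock_mhz + clock_step)         # reclaim clocks when cool
    return fan_pct, clock_mhz
```

The appeal, as described above, is that the user-facing knob (maximum fan speed) maps directly to the thing they actually care about (noise), rather than approximating it via a temperature or power target.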

Meanwhile 290X Crossfire performance also ended up being a very welcome surprise, thanks in large part to AMD’s XDMA engine. The idea of exclusively using the PCI-Express bus for inter-GPU communication on a high-end video card was worrying at first given the latency inherent to PCIe, but to the credit of AMD’s engineers they have shown that it can work, and work well. AMD is finally in a position where their multi-GPU frame pacing is up to snuff in all scenarios, and while there’s still some room to further reduce overall variance, we’re at the point where everything up to and including 4K is working well. AMD still faces a reckoning next month when they attempt to resolve the frame pacing issues on their existing products, but at the very least, going forward AMD has the hardware and the tools they need to keep the issue under control. Plus this gets rid of Crossfire bridges, which is a small but welcome improvement.
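For readers wondering what "variance" means in a frame pacing context, here is one simple way to quantify it. This is our own illustrative metric, not an AMD or FCAT-defined measurement: evenly paced frames score near zero, while the classic alternating fast/slow multi-GPU pattern scores high.

```python
# Illustrative frame pacing metric: the mean absolute difference between
# consecutive frame times. The function and thresholds are our own sketch,
# not any vendor-defined measurement.

def frame_pacing_variance(frame_times_ms):
    """Average frame-to-frame delta in milliseconds. 0 means perfectly even
    pacing; large values indicate stutter, e.g. the alternating short/long
    frame pattern of poorly paced multi-GPU rendering."""
    if len(frame_times_ms) < 2:
        return 0.0
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return sum(deltas) / len(deltas)

# Both runs below average roughly 60fps, but the second would feel far worse:
even = [16.7] * 6          # evenly paced -> variance 0.0
stutter = [8.0, 25.0] * 3  # alternating fast/slow -> variance 17.0
```

This is why average framerate alone can hide multi-GPU problems: the two sequences above deliver similar throughput but very different smoothness.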

Wrapping things up, it’s looking like neither NVIDIA nor AMD are going to let today’s launch set a new status quo. NVIDIA for their part has already announced a GTX 780 Ti for next month, and while we can only speculate on performance we certainly don’t expect NVIDIA to let the 290X go unchallenged. The bigger question is whether they’re willing to compete with AMD on price.

GTX Titan and its prosumer status aside, even with NVIDIA’s upcoming game bundle it’s very hard right now to justify the GTX 780 over the cheaper 290X, except on acoustic grounds. For some buyers that will be enough, but with 9% more performance for $100 less, there are certainly buyers who are going to shift their gaze over to the 290X. For those buyers NVIDIA can’t afford to be both slower and more expensive than the 290X. Unless NVIDIA does something totally off the wall like discontinuing the GTX 780 entirely, they will have to bring prices down in response to the launch of the 290X. The 290X is simply too disruptive to the GTX 780, and even the GTX 770 is going to feel the pinch between it and the 280X. Bundles will help, but what NVIDIA really needs to compete with the Radeon 200 series is a simple price cut.

Meanwhile AMD for their part would appear to have one more piece to play. Today we’ve seen the Big Kahuna, but retailers are already listing the R9 290, which based on AMD’s new naming scheme would be AMD’s lower tier Hawaii card. How that will pan out remains to be seen, but as a product clearly intended to fill in the $250 gap between 290X and 280X while also making Hawaii a bit more affordable, we certainly have high expectations for its performance. And if nothing else we’d certainly expect it to further ratchet up the pressure on NVIDIA.


396 Comments


  • Da W - Thursday, October 24, 2013 - link

    The reference cooler is noisy as hell, but it's a blower. At least it doesn't dump all the heat inside your case and let your other case fans handle it. It depends what you're looking for.

    Still makes as much noise as the 5870 did, and that was a commercial success.
  • slickr - Thursday, October 24, 2013 - link

    I didn't think AMD would deliver; in fact, from seeing some initial benchmarks I thought AMD had taken over 6 months just to deliver a graphics card slower than Titan, and that even with a cheap price it wouldn't be enough. Boy was I wrong.

    This card beats Titan in so many games and at so many resolutions and is almost $500 cheaper; it's also $100 cheaper than the GTX 780 and anywhere from 5% to 20% faster than the 780. That is just amazing.

    Hopefully this trickles down to the mid-range cards and we are going to see cards like the 280X go for less than $250.

    I mean, unless Nvidia positions the Titan at $550 as well, I don't think it will sell very much at all. In the 290X you have a better performing card at almost half the price. Nvidia has its work cut out for it, and I sure hope the 780 Ti edition really brings the performance and price as well.
  • eanazag - Thursday, October 24, 2013 - link

    I'm sporting an Nvidia GPU in my rig. I don't see any option for Nvidia other than to reduce the Titan, 770, and 780 in cost. I can't expect the 780 Ti's performance to trump the Titan. I will say that there is power and cooling headroom for Nvidia to ratchet things up and make this interesting. This is a bold move on AMD's part and does wonders for consumers. Based on some of the other comments: the current news does not kill off either brand, by the way. PC gaming and desktops are not dead.
  • kwrzesien - Thursday, October 24, 2013 - link

    Ryan, can we get a pipeline article or retweet this article when it is complete? Thanks!
  • spiked_mistborn - Thursday, October 24, 2013 - link

    Nice job AMD! Competition is good! Also, feel free to use my GSYNC idea about putting a frame buffer in the display and letting the video card control the refresh rate. This post is from March 2013. Apparently adding a dash to make it G-Sync makes it different somehow. http://techreport.com/discussion/24553/inside-the-...
  • Sorodsam - Thursday, October 24, 2013 - link

    I'm surprised no one's commented on the new "AMD Center", or this troubling text:

    "You've landed on the AMD Portal on AnandTech. This section is sponsored by AMD."

    There's a big difference between a site that runs an occasional AMD ad and a site with an entire section that's expressly "sponsored by AMD", especially considering AnandTech's (former?) guiding principle that product reviewers shouldn't be aware of who exactly is buying advertising and when. They can hardly be unaware of it now.
  • MrMaestro - Thursday, October 24, 2013 - link

    The reason people aren't commenting on it is because it's already been commented on. Take a look at the article announcing AMD Centre. That is the appropriate place for such comments.
  • anevmann - Thursday, October 24, 2013 - link

    Ryan, will this card require PCIe 3.0 for gaming?

    Can you do a test with and without PCIe 3.0? I really want this card, but I wanna know in advance if I have to upgrade my system.

    A
  • Ryan Smith - Monday, October 28, 2013 - link

    I can't promise when it will be done (given the insanity of our schedule over the next 3 weeks), but at some point we will follow this up with a reprise article, that among other things will cover PCIe bandwidth vs. Crossfire scaling, CF testing in quiet mode, and some noise equalization to see what fan levels it would take to match a GTX 780 and what the resulting performance would look like.

    Anyhow, for a single card setup none of my data thus far supports PCIe 3.0 being a requirement. We're not to the point yet where PCIe 2.0 x16 is a general gaming bottleneck.
  • Hung_Low - Thursday, October 24, 2013 - link

    Is the 290X really the absolute maxed-out version of Hawaii? Or does it also leave a lot of room, like Titan does for GK110?
    Perhaps the GPUs used in the 290X are those Hawaii chips with imperfections, with the high quality chips saved for an extreme edition 290X/290X+?
