Final Words

Bringing this review to a close, going into this launch AMD has been especially excited about the 290X, and it's easy to see why. Traditionally AMD has not been able to compete with NVIDIA's big flagship GPUs, and while that hasn't stopped AMD from carving out a comfortable spot for themselves, it does mean that NVIDIA gets left to their own devices. As such, while the sub-$500 market has been heavily competitive this entire generation, the same could not be said about the market over $500 until now. And although it's a niche of a niche in terms of volume, this market segment is where the most powerful video cards reside, so fierce competition here not only brings down the price of these flagship cards sooner, but in the process it inevitably pushes prices down across the board. So to see AMD performance-competitive with GTX Titan and GTX 780 using a single-GPU card of their own is absolutely a breath of fresh air.

Getting down to business then, AMD has clearly positioned the 290X as a price/performance monster, and while that's not the be-all and end-all of evaluating video cards, it's certainly going to be the biggest factor for most buyers. To that end at 2560x1440 – what I expect will be the most common resolution used with such a card for the time being – AMD is essentially tied with GTX Titan, delivering an average of 99% of the performance of NVIDIA's prosumer-level flagship. Against NVIDIA's cheaper and more gaming-oriented GTX 780 that becomes an outright lead, with the 290X ahead by an average of 9% and never falling behind the GTX 780.
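
(As a rough illustration of how such an average is arrived at – this is not AnandTech's exact methodology, and the game names and FPS figures below are hypothetical placeholders rather than measured results – relative performance is computed per game against the baseline card and then averaged:)

    # Illustrative sketch in Python: averaging per-game performance relative
    # to a baseline card. Games and FPS values are hypothetical placeholders,
    # not measurements from this review.
    results = {
        # game: (card under test FPS, baseline card FPS) at 2560x1440
        "Game A": (68.0, 69.1),
        "Game B": (55.2, 54.8),
        "Game C": (47.5, 48.3),
    }

    ratios = [test / base for test, base in results.values()]
    average_relative = sum(ratios) / len(ratios)

    print(f"Average relative performance: {average_relative:.0%} of baseline")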

Consequently, against NVIDIA's pricing structure the 290X is by every definition a steal at $549. Even if it were merely equal to the GTX 780 it would still be $100 cheaper, but instead it's both faster and cheaper, something that has proven time and again to be a winning combination in this industry. Beyond that, the fact that it can even tie GTX Titan is mostly icing on the cake – for traditional gamers Titan hasn't made a lot of sense since the GTX 780 came out – but nevertheless it's an important milestone for AMD, since it's a degree of parity they haven't achieved in years.

But with that said, although the 290X has a clear grip on performance and price, it does come at the cost of power and cooling. With GTX Titan and GTX 780 NVIDIA set the bar for power efficiency and cooling performance on a high-end card, and while that's not necessarily out of AMD's reach, it's the kind of thing that's only sustainable with high video card prices, which is not where AMD has decided to take the 290X. By focusing on high performance AMD has had to push quite a bit of power through the 290X, and by focusing on price they had to do so without blowing their budget on cooling. The end result is that the 290X is more power hungry than any comparable high-end card, and while AMD is able to effectively dissipate that much heat, the resulting cooling performance (as measured by noise) is at best mediocre. It's not so loud as to be intolerable for a single-GPU setup, but it's about as loud as can still be called reasonable, never mind preferable.

On that note, while this specific cooler implementation leaves room for improvement, the underlying technology has turned out rather well thanks to AMD's PowerTune improvements. Now that AMD has fine-grained control over GPU clockspeeds and voltages, along with the necessary hardware to monitor and control the full spectrum of power/temperature/noise, the door is open to more meaningful ways of adjusting the card and monitoring its status. Admittedly a lot of this is a retread of ground NVIDIA already covered with GPU Boost 2.0, but AMD's idea of throttling on fan speed in particular is a more intuitive way of controlling GPU noise than trying to operate by proxy via temperature and/or power.
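
To make the distinction concrete, below is a minimal, hypothetical sketch of the two control philosophies. This is not AMD's PowerTune code; the function names, step size, and thresholds are invented for illustration. The point is that capping the fan directly caps noise, whereas a temperature or power target only influences noise indirectly.

    # Hypothetical Python sketch of two throttling policies; not AMD's
    # actual PowerTune implementation. Thresholds and step sizes are invented.

    def throttle_by_temperature(temp_c, clock_mhz, temp_target_c=95, step_mhz=13):
        """Proxy control: the user picks a temperature target, and the fan
        (and thus noise) ends up wherever it must be to hold that target."""
        if temp_c > temp_target_c:
            clock_mhz -= step_mhz  # shed heat by dropping the core clock
        return clock_mhz

    def throttle_by_fan_speed(fan_pct, clock_mhz, fan_cap_pct=40, step_mhz=13):
        """Direct control: the fan (and thus noise) is capped outright, and
        the core clock is reduced until the card lives within that cap."""
        if fan_pct >= fan_cap_pct:
            clock_mhz -= step_mhz  # hold noise constant, trade performance
        return clock_mhz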

Meanwhile 290X Crossfire performance also ended up being a pleasant surprise, thanks in large part to AMD's XDMA engine. The idea of exclusively using the PCI-Express bus for inter-GPU communication on a high-end video card was worrying at first given the inherent latency that comes with PCIe, but to the credit of AMD's engineers they have shown that it can work and that it works well. AMD is finally in a position where their multi-GPU frame pacing is up to snuff in all scenarios, and while there's still some room for improvement in further reducing overall variance, we're at the point where everything up to and including 4K is working well. AMD still faces a reckoning next month when they attempt to resolve the frame pacing issues on their existing products, but at the very least going forward AMD has the hardware and the tools they need to keep the issue under control. Plus this gets rid of Crossfire bridges, which is a small but welcome improvement.
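
For readers wondering what "variance" means in practice, the arithmetic behind frame pacing metrics is straightforward. The sketch below uses made-up timestamps for illustration; real frame pacing analysis (FCAT and the like) works from captured video output, but the underlying math is the same idea.

    # Minimal Python sketch: quantifying frame pacing from frame delivery
    # timestamps. The timestamps are made-up illustrative data.
    from statistics import mean, pstdev

    timestamps_ms = [0.0, 16.9, 33.1, 50.2, 66.5, 83.9, 100.1]

    # Frame times are the deltas between consecutive frame deliveries.
    frame_times = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

    avg = mean(frame_times)
    spread = pstdev(frame_times)  # lower spread = more consistent pacing
    worst = max(abs(t - avg) for t in frame_times)

    print(f"Average frame time: {avg:.1f} ms")
    print(f"Frame time standard deviation: {spread:.2f} ms")
    print(f"Worst deviation from the average: {worst:.1f} ms")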

Wrapping things up, it's looking like neither NVIDIA nor AMD is going to let today's launch set a new status quo. NVIDIA for their part has already announced a GTX 780 Ti for next month, and while we can only speculate on its performance, we certainly don't expect NVIDIA to let the 290X go unchallenged. The bigger question is whether they're willing to compete with AMD on price.

GTX Titan and its prosumer status aside, even with NVIDIA's upcoming game bundle it's very hard right now to justify GTX 780 over the cheaper 290X, except on acoustic grounds. For some buyers that will be enough, but with 9% more performance for $100 less there are certainly buyers who are going to shift their gaze over to the 290X. For those buyers NVIDIA can't afford to be both slower and more expensive than the 290X. Unless NVIDIA does something totally off the wall like discontinuing the GTX 780 entirely, they have to bring prices down in response to the launch of the 290X. The 290X is simply too disruptive to the GTX 780, and even the GTX 770 is going to feel the pinch between that and the 280X. Bundles will help, but what NVIDIA really needs to compete with the Radeon 200 series is a simple price cut.

Meanwhile AMD for their part would appear to have one more piece to play. Today we've seen the Big Kahuna, but retailers are already listing the R9 290, which based on AMD's new naming scheme would be AMD's lower-tier Hawaii card. How that will pan out remains to be seen, but as a product clearly intended to fill in the $250 price gap between the 290X and the 280X while also making Hawaii a bit more affordable, we certainly have high expectations for its performance. And if nothing else, we'd certainly expect it to further ratchet up the pressure on NVIDIA.

396 Comments

  • pattycake0147 - Friday, October 25, 2013 - link

    Nope, piroroadkill is spot on in speaking his opinion. Anand continually asks for reader feedback, and he's doing just that.

    The rate at which this article is being finished is piss poor. Ryan said it would be finished in the morning the day of posting which meant in the next 12 hr or so. The main explanatory pages took about 24 hr to be completely fleshed out, and the graphs still don't have any text explaining the trends in performance. I actually value the author's commentary more than the graphs, and looking through a review which is incomplete over 36 hr after posting is much below Anandtech standards.

    I hate to bring it up because I like reading the vast majority of content on Anandtech regardless of market or company, but I firmly believe piroroadkill is correct in saying that a new Apple product would have had a complete and thorough review shortly after the NDA was lifted.
  • HisDivineOrder - Friday, October 25, 2013 - link

    He had three R9 290X's in one system. Crossing his chest, he took out his third and slid its PCIe into the test bed. Immediately, the room began to darken and a voice spoketh, "You dare install THREE R9 290X's into one system! You hath incurred the wrath of The Fixer, demon lord of the 9.5th circle of hell! Prepare for the doooooom!"

    Then the system erupted into flames, exploding outward with rapid napalm-like flames that sent him screaming out the door. Within seconds, the entire building was burning and within minutes there was nothing left but ashes and regrets.

    Ever since, he has been locked away in a mental health ward, scribbling on a notepad, "Crossfire," over and over. Some say on the darkest nights, he even dares to whisper a single phrase, "Three-way."
  • B3an - Saturday, October 26, 2013 - link

    LOL!
  • Ryan Smith - Monday, October 28, 2013 - link

    Hahaha!

    Thanks man, I needed that.
  • yacoub35 - Friday, October 25, 2013 - link

    It's a bit silly to list the 7970 as $549 when the truth is they can be had for as little as $200. And they're easily the best deal for a GPU these days.
  • yacoub35 - Friday, October 25, 2013 - link

    To clarify: A marketing piece lists "Launch prices", a proper review compares real-world prices.
  • yacoub35 - Friday, October 25, 2013 - link

    So double the ROPs on a new architecture and an extra GB of faster GDDR5 results in maybe 10-20 more frames than a 7970GE at the resolution most of us run (1920x). Somehow I don't think that's worth twice the price, let alone the full $549 for someone who already owns a 7970.
  • Jumangi - Friday, October 25, 2013 - link

    Only a clueless noob with too much money in their pocket would buy a 290x if they are running at 1920 resolution.
  • kyuu - Friday, October 25, 2013 - link

    If you're just looking to game at high details on a single 1080p monitor, then no, the 290X isn't interesting as you're spending a lot of money for power you don't need. If you're gaming at 1440p or higher and/or using Eyefinity, then it's a different story.
  • Hulk - Friday, October 25, 2013 - link

    I just wanted to thank Ryan for getting up the charts before the rest of the article. We could have either waited for the entire article or gotten the performance charts as soon as you completed them and then the text later. Thanks for thinking of us and not holding back the performance data until the article was finished. It's exactly that type of thinking that makes this site the best. I can imagine you starting to work on the text and thinking, "You know what? I have the performance data so why don't I post it instead of holding it back until the entire article is finished."

    Well done as usual.
