Final Words

Bringing this review to a close, it's easy to see why AMD has been especially excited about the 290X going into this launch. Traditionally AMD has not been able to compete with NVIDIA's biggest flagship GPUs, and while that hasn't stopped AMD from carving out a comfortable spot for themselves, it has meant that NVIDIA gets left to their own devices at the top. As such, while the sub-$500 market has been heavily competitive this entire generation, the same could not be said of the market over $500 until now. And although it's a niche of a niche in terms of volume, this market segment is where the most powerful video cards reside, so fierce competition here not only brings down the price of these flagship cards sooner, but in the process it inevitably pushes prices down across the board. So seeing AMD performance-competitive with GTX Titan and GTX 780 with a single-GPU card of their own is absolutely a breath of fresh air.

Getting down to business then, AMD has clearly positioned the 290X as a price/performance monster, and while price/performance isn't the be-all and end-all of evaluating video cards, it's certainly going to be the biggest factor for most buyers. To that end, at 2560x1440 – what I expect will be the most common resolution used with such a card for the time being – AMD is essentially tied with GTX Titan, delivering an average of 99% of the performance of NVIDIA's prosumer-level flagship. Against NVIDIA's cheaper and more gaming-oriented GTX 780 that becomes an outright lead, with the 290X ahead by an average of 9% and never falling behind the GTX 780.

Consequently, against NVIDIA's pricing structure the 290X is by every definition a steal at $549. Even if it were merely equal to the GTX 780 it would still be $100 cheaper, but instead it's both faster and cheaper, a combination that has proven time and again to be a winner in this industry. The fact that it can even tie GTX Titan is mostly icing on the cake – for traditional gamers Titan hasn't made a lot of sense since the GTX 780 came out – but it's nevertheless an important milestone for AMD, since it's a degree of parity they haven't achieved in years.

But with that said, although the 290X has a clear grip on performance and price, it comes at the cost of power and cooling. With GTX Titan and GTX 780, NVIDIA set the bar for power efficiency and cooling performance on a high-end card, and while that's not necessarily out of AMD's reach, it's the kind of thing that's only sustainable with high video card prices, which is not where AMD has decided to take the 290X. By focusing on high performance AMD has had to push quite a bit of power through the 290X, and by focusing on price they had to do so without blowing their budget on cooling. The end result is that the 290X is more power hungry than any comparable high-end card, and while AMD is able to effectively dissipate that much heat, the resulting cooling performance (as measured by noise) is at best mediocre. It's not so loud as to be intolerable in a single-GPU setup, but it's as loud as can still be called reasonable, never mind preferable.

On that note, while this specific cooler implementation leaves room for improvement, the underlying technology has turned out rather well thanks to AMD's PowerTune improvements. Now that AMD has fine-grained control over GPU clockspeeds and voltages, along with the hardware needed to monitor and manage the full spectrum of power, temperature, and noise, the door is open to more meaningful ways of adjusting the card and monitoring its status. Admittedly a lot of this is a retread of ground NVIDIA already covered with GPU Boost 2.0, but AMD's fan throttling mechanism in particular is a more intuitive method of controlling GPU noise than trying to operate by proxy via temperature and/or power targets.
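
To make that distinction concrete, here's a minimal sketch of a fan-target control loop in the PowerTune vein. Everything in it is hypothetical – the names, thresholds, and step sizes are illustrative, not AMD's actual firmware logic:

```python
# Minimal sketch of a noise-first throttle loop, in the spirit of
# PowerTune's maximum fan speed setting. All names and values are
# hypothetical; the real controller lives in the card's firmware.

MAX_FAN_PCT = 40      # the user's noise knob: fan may not exceed 40% speed
TEMP_TARGET_C = 95    # temperature the GPU is allowed to run up to
MAX_PERF_LEVEL = 7    # highest clock/voltage (DPM) state

def throttle_step(temp_c, fan_pct, perf_level):
    """One control-loop iteration; returns (new_fan_pct, new_perf_level)."""
    if temp_c > TEMP_TARGET_C:
        if fan_pct < MAX_FAN_PCT:
            # Headroom under the noise cap: spend it on airflow first,
            # preserving performance at the cost of (bounded) noise.
            return min(fan_pct + 5, MAX_FAN_PCT), perf_level
        # Fan pinned at the noise cap and still too hot: now, and only
        # now, shed heat by stepping down to a slower clock/voltage state.
        return fan_pct, max(perf_level - 1, 0)
    # Cool enough: hold the fan where it is and claw back performance.
    return fan_pct, min(perf_level + 1, MAX_PERF_LEVEL)
```

The appeal of this inversion is that the fan cap is the user's direct, intuitive noise control; once the cooler saturates, clockspeed is what gives way, rather than the user having to guess what noise level a given temperature or power target will produce.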

Meanwhile 290X Crossfire performance ended up being a most welcome surprise, thanks in large part to AMD's XDMA engine. The idea of exclusively using the PCI-Express bus for inter-GPU communication on a high-end video card was worrying at first given the latency inherent to PCIe, but to the credit of AMD's engineers they have shown that it can work, and work well. AMD is finally in a position where their multi-GPU frame pacing is up to snuff in all scenarios, and while there's still some room to further reduce overall frame time variance, we're at the point where everything up to and including 4K is working well. AMD still faces a reckoning next month when they attempt to resolve the frame pacing issues on their existing products, but at the very least, going forward AMD has the hardware and the tools they need to keep the issue under control. Plus this gets rid of Crossfire bridges, which is a small but welcome improvement.
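
For those unfamiliar with what frame pacing actually entails, a minimal sketch helps. The following toy pacer (purely illustrative; AMD's driver logic is far more involved and not public) holds back frames that finish ahead of the recent cadence so that on-screen frame times stay even:

```python
def pace_frames(ready_times, alpha=0.2):
    """Toy alternate-frame-rendering pacer (illustrative only).

    ready_times: timestamps (seconds) at which the two GPUs finish
    rendering successive frames. Returns presentation timestamps where
    early frames are delayed toward the smoothed cadence, trading a
    little latency for much lower frame-to-frame variance.
    """
    if not ready_times:
        return []
    presented = [ready_times[0]]
    avg_interval = None
    for ready in ready_times[1:]:
        raw = ready - presented[-1]  # interval the display would see unpaced
        avg_interval = raw if avg_interval is None else \
            (1 - alpha) * avg_interval + alpha * raw
        # Never present before the frame exists, and never before the
        # smoothed cadence point; the later of the two wins.
        presented.append(max(ready, presented[-1] + avg_interval))
    return presented

# Example: frames arriving at an uneven 33ms/17ms rhythm...
times = pace_frames([0.000, 0.033, 0.050, 0.083, 0.100])
# ...come out metered at a much more even ~24-33ms cadence.
```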

Wrapping things up, it's looking like neither NVIDIA nor AMD is going to let today's launch set a new status quo. NVIDIA for their part has already announced the GTX 780 Ti for next month, and while we can only speculate on its performance, we certainly don't expect NVIDIA to let the 290X go unchallenged. The bigger question is whether they're willing to compete with AMD on price.

GTX Titan and its prosumer status aside, even with NVIDIA's upcoming game bundle it's very hard right now to justify the GTX 780 over the cheaper 290X except on acoustic grounds. For some buyers that will be enough, but with 9% more performance for $100 less, plenty of buyers are going to shift their gaze over to the 290X, and NVIDIA can't afford to be both slower and more expensive for them. Unless NVIDIA does something totally off the wall like discontinuing the GTX 780 entirely, they will have to bring prices down in response to the 290X's launch. The 290X is simply too disruptive to the GTX 780, and even the GTX 770 is going to feel the pinch between it and the 280X. Bundles will help, but what NVIDIA really needs to compete with the Radeon 200 series is a simple price cut.

Meanwhile AMD for their part would appear to have one more piece to play. Today we've seen the Big Kahuna, but retailers are already listing the R9 290, which based on AMD's new naming scheme would be AMD's lower-tier Hawaii card. How it will pan out remains to be seen, but as a product clearly intended to fill in the $250 gap between the 290X and 280X while also making Hawaii a bit more affordable, we certainly have high expectations for its performance. And if nothing else, we'd expect it to further ratchet up the pressure on NVIDIA.

Comments

  • itchyartist - Thursday, October 24, 2013 - link

    Incredible performance and value from AMD!

    The fastest single chip video card in the world. Overall it is faster than the nvidia Titan and only $549! Almost half the price!

    Truly great to see the best performance around at a cost that is not bending you over. Battlefield 4 with AMD Mantle just around the corner. These new 290X GPUs are going to be uncontested Kings of the Hill for the Battlefield 4 game. Free Battlefield game with the 290X too. Must buy.

    Incredible!
  • Berzerker7 - Thursday, October 24, 2013 - link

    ...really? The card is $600. You reek of AMD PR.
  • Novulux - Thursday, October 24, 2013 - link

    It says $549 in this very review?
  • Berzerker7 - Thursday, October 24, 2013 - link

    It does indeed. His article still smells like a pre-written script.
  • siliconwizard - Thursday, October 24, 2013 - link

    Like all the reviews state, GTX Titan is now irrelevant. 290X took the crown and saved the wallet.
  • siliconwizard - Thursday, October 24, 2013 - link

    Thinking that sphere toucher's comment is accurate. Bit of salt here over AMD taking over the high end slot and ridiculing the Titan card. Only going to get worse once the Mantle-enabled games are released. Nvidia is finished for Battlefield 4. Crushed by AMD, 290X and Mantle.
  • MousE007 - Thursday, October 24, 2013 - link

    Mantle..... lol, nvidia G-Sync just killed AMD
  • ninjaquick - Thursday, October 24, 2013 - link

    lol? a G-Sync type solution is a good candidate for being integrated into a VESA standard, and made part of the display's information that is exchanged through DP/HDMI/DVI, so all AMD would need to do is make sure their drivers are aware that they can send frames to the screen as soon as they are finished. The best part would be that, with the whole Mantle deal, AMD would probably expose this to the developer, allowing them to determine when frames are 'G-Sync'd' and when they are not.
  • MousE007 - Thursday, October 24, 2013 - link

    No, there is a "handshake" between the GPU and the monitor or TV; it will not be supported with any other brand.
  • inighthawki - Thursday, October 24, 2013 - link

    You do realize that it can still be put into the VESA standard, right? Then only GPUs supporting the standard can take advantage of it. Also ANYONE who believes that GSync OR Mantle is going to "kill the other" is just an idiot.
