Final Words

Bringing this review to a close: going into this launch AMD was especially excited about the 290X, and it's easy to see why. Traditionally AMD has not been able to compete with NVIDIA's big flagship GPUs, and while that hasn't stopped AMD from carving out a comfortable spot for themselves, it has meant that NVIDIA gets left to their own devices at the top. As such, while the sub-$500 market has been heavily competitive this entire generation, the same could not be said about the market over $500 until now. And although it's a niche of a niche in terms of volume, this market segment is where the most powerful video cards reside, so fierce competition here not only brings down the price of these flagship cards sooner, but in the process inevitably pushes prices down across the board. So seeing AMD performance-competitive with GTX Titan and GTX 780 with a single-GPU card of their own is absolutely a breath of fresh air.

Getting down to business then, AMD has clearly positioned the 290X as a price/performance monster, and while that's not the be-all and end-all of evaluating video cards, it's certainly going to be the biggest factor for most buyers. To that end, at 2560x1440 – what I expect will be the most common resolution used with such a card for the time being – AMD is essentially tied with GTX Titan, delivering an average of 99% of the performance of NVIDIA's prosumer-level flagship. Against NVIDIA's cheaper and more gaming-oriented GTX 780 that becomes an outright lead, with the 290X ahead by an average of 9% and never falling behind the GTX 780.
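As a rough illustration of how figures like "99% of Titan" and "a 9% lead" are derived, the sketch below averages per-game performance ratios. The FPS numbers here are hypothetical placeholders, not the review's actual benchmark data:

```python
# Hedged sketch: deriving an "average of X%" relative performance figure
# from per-game results. All FPS values below are made-up placeholders.

HYPOTHETICAL_RESULTS = {
    # game: (r9_290x_fps, gtx_780_fps)
    "Game A": (62.0, 55.0),
    "Game B": (48.0, 45.5),
    "Game C": (71.0, 66.0),
}

def average_relative_performance(results):
    """Mean of per-game performance ratios (card A / card B), as a percent."""
    ratios = [a / b for a, b in results.values()]
    return 100.0 * sum(ratios) / len(ratios)

rel = average_relative_performance(HYPOTHETICAL_RESULTS)
print(f"290X vs GTX 780: {rel:.1f}% on average")
```

A real comparison would of course use the full benchmark suite; the point is only that the headline percentage is an average over per-game ratios, so a card can lead on average while individual games vary.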

Consequently, against NVIDIA's pricing structure the 290X is by every definition a steal at $549. Even if it were merely equal to the GTX 780 it would still be $100 cheaper, but instead it's both faster and cheaper, something that has proven time and again to be a winning combination in this industry. Meanwhile the fact that it can even tie GTX Titan is mostly icing on the cake – for traditional gamers Titan hasn't made a lot of sense since the GTX 780 came out – but nevertheless it's an important milestone for AMD, since it's a degree of parity they haven't achieved in years.

But with that said, although the 290X has a clear grip on performance and price, it comes at the cost of power and cooling. With GTX Titan and GTX 780 NVIDIA set the bar for power efficiency and cooling performance on a high-end card, and while that's not necessarily out of AMD's reach, it's the kind of thing that's only sustainable with high video card prices, which is not where AMD has decided to take the 290X. By focusing on high performance AMD has had to push quite a bit of power through the 290X, and by focusing on price they had to do so without blowing their budget on cooling. The end result is that the 290X is more power-hungry than any comparable high-end card, and while AMD is able to effectively dissipate that much heat, the resulting cooling performance (as measured by noise) is at best mediocre. It's not so loud as to be intolerable for a single-GPU setup, but it's as loud as can be called reasonable, never mind preferable.

On that note, while this specific cooler implementation leaves room for improvement, the underlying technology has turned out rather well thanks to AMD's PowerTune improvements. Now that AMD has fine-grained control over GPU clockspeeds and voltages, along with the hardware needed to monitor and control the full spectrum of power/temperature/noise, the door is open to more meaningful ways of adjusting the card and monitoring its status. Admittedly a lot of this is a retread of ground NVIDIA already covered with GPU Boost 2, but AMD's fan throttling in particular is a more intuitive method of controlling GPU noise than trying to operate by proxy via temperature and/or power.
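The fan throttling idea can be sketched as a simple control loop: the user caps the maximum fan speed (a direct stand-in for noise), and clocks only come down once that cap alone can no longer hold the temperature target. This is an illustrative model; all names, thresholds, and step sizes below are assumptions, not AMD's actual PowerTune implementation:

```python
# Hedged sketch of fan-cap throttling. The user-facing knob is the fan cap
# (a noise proxy); the clock is sacrificed only when the fan is pinned at
# that cap. Thresholds and step sizes are illustrative assumptions.

TEMP_TARGET_C = 95       # hypothetical throttle temperature
CLOCK_STEP_MHZ = 13      # hypothetical clock adjustment step

def control_step(temp_c, fan_pct, fan_cap_pct, clock_mhz, max_clock_mhz=1000):
    """One iteration of a fan-capped throttle loop.

    Returns the new (fan_pct, clock_mhz). Fan speed is raised first; only
    when the fan is pinned at the user's cap does the clock come down.
    """
    if temp_c > TEMP_TARGET_C:
        if fan_pct < fan_cap_pct:
            fan_pct = min(fan_cap_pct, fan_pct + 5)          # spin up, stay under cap
        else:
            clock_mhz = max(0, clock_mhz - CLOCK_STEP_MHZ)   # fan maxed: throttle clocks
    elif fan_pct >= fan_cap_pct and clock_mhz < max_clock_mhz:
        clock_mhz = min(max_clock_mhz, clock_mhz + CLOCK_STEP_MHZ)  # headroom: recover clocks
    return fan_pct, clock_mhz
```

The design point the article is making falls out of this structure: noise is bounded directly by the user's fan cap, rather than indirectly by guessing which temperature or power limit will keep the fan quiet.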

Meanwhile 290X Crossfire performance also ended up being a most welcome surprise, thanks in large part to AMD's XDMA engine. The idea of exclusively using the PCI-Express bus for inter-GPU communication on a high-end video card was worrying at first given the latency inherent to PCIe, but to the credit of AMD's engineers they have shown that it can work, and work well. AMD is finally in a position where their multi-GPU frame pacing is up to snuff in all scenarios, and while there's still some room for improvement in further reducing overall variance, we're to the point where everything up to and including 4K is working well. AMD still faces a reckoning next month when they attempt to resolve the frame pacing issues on their existing products, but at the very least going forward AMD has the hardware and the tools they need to keep the issue under control. Plus this gets rid of Crossfire bridges, which is a small but welcome improvement.
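To see why frame pacing is judged separately from average frame rates, consider the toy analysis below: two hypothetical frame-time traces with the same average deliver very different experiences. The numbers are made up for illustration; a real analysis would use captured frame-time data:

```python
# Hedged sketch: frame pacing is about the variation in frame delivery
# times, not the average frame rate. Both traces below are invented and
# average the same ~16.7 ms (60 fps), yet one stutters badly.

def pacing_stats(frame_times_ms):
    """Return (mean, 95th-percentile deviation from mean) for a trace."""
    n = len(frame_times_ms)
    mean = sum(frame_times_ms) / n
    deltas = sorted(abs(t - mean) for t in frame_times_ms)
    p95 = deltas[min(n - 1, int(0.95 * n))]
    return mean, p95

well_paced   = [16.7, 16.9, 16.5, 16.8, 16.6, 16.7, 16.8, 16.6]
poorly_paced = [10.0, 23.4, 9.8, 23.6, 10.1, 23.3, 9.9, 23.5]

for label, trace in (("well paced", well_paced), ("poorly paced", poorly_paced)):
    mean, p95 = pacing_stats(trace)
    print(f"{label}: mean {mean:.1f} ms, 95th-pct deviation {p95:.1f} ms")
```

Both traces report the same average FPS, which is exactly why pacing problems were invisible in traditional FPS benchmarks and why reducing frame-to-frame variance is the metric that matters here.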

Wrapping things up, it’s looking like neither NVIDIA nor AMD are going to let today’s launch set a new status quo. NVIDIA for their part has already announced a GTX 780 Ti for next month, and while we can only speculate on performance we certainly don’t expect NVIDIA to let the 290X go unchallenged. The bigger question is whether they’re willing to compete with AMD on price.

GTX Titan and its prosumer status aside, even with NVIDIA's upcoming game bundle it's very hard right now to justify the GTX 780 over the cheaper 290X, except on acoustic grounds. For some buyers that will be enough, but with 9% more performance for $100 less, there are certainly buyers who are going to shift their gaze over to the 290X. For those buyers NVIDIA can't afford to be both slower and more expensive than the 290X. Unless NVIDIA does something totally off the wall like discontinuing the GTX 780 entirely, they have to bring prices down in response to the launch of the 290X. The 290X is simply too disruptive to the GTX 780, and even the GTX 770 is going to feel the pinch between that and the 280X. Bundles will help, but what NVIDIA really needs to compete with the Radeon 200 series is a simple price cut.

Meanwhile AMD for their part would appear to have one more piece to play. Today we’ve seen the Big Kahuna, but retailers are already listing the R9 290, which based on AMD’s new naming scheme would be AMD’s lower tier Hawaii card. How that will pan out remains to be seen, but as a product clearly intended to fill in the $250 gap between 290X and 280X while also making Hawaii a bit more affordable, we certainly have high expectations for its performance. And if nothing else we’d certainly expect it to further ratchet up the pressure on NVIDIA.

396 Comments

  • TheJian - Friday, October 25, 2013

    Wrong, Zotac price in cart $624. :) Personally I'd buy an OC card for $650 but that's just me.
    http://www.newegg.com/Product/Product.aspx?Item=N8...
  • 46andtool - Thursday, October 24, 2013

    Your comment makes no sense; all I see are excuses and misinformation in your post. "It doesn't cost less than a GTX 780, it only has a lower MSRP" is just stupid: Battlefield 4 edition 290Xs are already on Newegg for $579, and the only cheap GTX 780s you will find will be used ones.
  • chrnochime - Thursday, October 24, 2013

    What $549? Every 780 on NE goes for $649. I want some of the kool-aid you're drinking.
  • HisDivineOrder - Friday, October 25, 2013

    It IS loud. HardOCP have a tendency to be so "hard" they ignore the volume of the card. They aren't the most reliable of sites when it comes to the acoustics of a card. Not in the past and not today.
  • JDG1980 - Thursday, October 24, 2013

    Regarding 1080p performance, so what? You don't need a $500+ video card to get acceptable frame rates at that resolution. A $200-$300 card will do just fine. $500+ video cards are for multi-monitor setups or high resolution (1440p+) displays.
    Regarding the noise, that's a problem - AMD clearly stretched things as far as they could go with GCN to reach the current performance level. I know that EK has already announced a 290X waterblock for those enthusiasts who use custom loops. I wouldn't be surprised to see someone come out with a self-contained closed-loop watercooler for the 290X, similar to those that have been available for CPUs for a couple years now. That might help fix the noise issues, especially if it used a dual 120mm/140mm radiator.
  • 46andtool - Thursday, October 24, 2013

    We are just now breaking 60fps at 1080p in demanding games at max details, and even more demanding games are just around the corner, so you're telling people what exactly? And everybody knows AMD makes retarded reference coolers, so that's another moot point. Lets-try-and-discredit-AMDs-stellar-new-product-any-way-we-can-but-the-only-way-we-know-how-is-by-grasping-at-straws.
  • inighthawki - Thursday, October 24, 2013

    BS, there's absolutely nothing wrong with a high end card on a 1080p display. Just look at the benchmarks: in Crysis 3 at 1080p on high, a 7970GE barely hits 60fps, and no doubt that will drop below 60 on many occasions (it's just an average). On top of that, not all games are nearly as well optimized as Crytek's, or are just far more complex. In Total War: Rome 2, even the 290X barely hits 60fps on extreme with MEDIUM shadows. Or look at Company of Heroes 2, where even the 290X hits a minimum of 37fps on extreme.

    On top of all of that, high resolution IPS panels are super expensive, not everyone cares enough about that to spend the money. The difference between a quality 1080p and a quality 1440p panel can be almost as much as the video card itself.
  • patrioteagle07 - Thursday, October 24, 2013

    Not really... You can find refurbed ZR30s for under $600.
    If you are going to spend $1k on gfx it's rather short-sighted to keep your TN panels...
  • inighthawki - Thursday, October 24, 2013

    That's at LEAST several hundred dollars more than the majority of people are willing to spend on a monitor. 1080p TN panels are fine for most people, including most gamers. What people care about is not monitor count, pixel count, or color accuracy; they want high quality shaded pixels and a good framerate. This is where high end video cards on smaller monitors come into play. There are plenty of reasons to do it. Do not confuse your own values with what everyone else wants.
  • ShieTar - Friday, October 25, 2013

    Also, an increasing number of players consider 120 FPS, not 60 FPS, to be the acceptable framerate.
