Final Words

Bringing this review to a close, it's clear that going into this launch AMD has been especially excited about the 290X, and it's easy to see why. Traditionally AMD has not been able to compete with NVIDIA's big flagship GPUs, and while that hasn't stopped AMD from carving out a comfortable spot for themselves, it does mean that NVIDIA gets left to their own devices. As such, while the sub-$500 market has been heavily competitive this entire generation, the same could not be said of the market above $500 until now. And although it's a niche of a niche in terms of volume, this market segment is where the most powerful video cards reside, so fierce competition here not only brings down the price of these flagship cards sooner, but in the process it inevitably pushes prices down across the board. So seeing AMD performance-competitive with GTX Titan and GTX 780 with a single-GPU card of their own is absolutely a breath of fresh air.

Getting down to business then, AMD has clearly positioned the 290X as a price/performance monster, and while that's not the be-all and end-all of evaluating video cards, it's certainly going to be the biggest factor for most buyers. To that end, at 2560x1440 – what I expect will be the most common resolution used with such a card for the time being – AMD is essentially tied with GTX Titan, delivering an average of 99% of the performance of NVIDIA's prosumer-level flagship. Against NVIDIA's cheaper and more gaming-oriented GTX 780 that becomes an outright lead, with the 290X ahead by an average of 9% and never falling behind the GTX 780.
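
As an aside, for anyone who wants to reproduce averages like these from their own set of benchmarks, the arithmetic is simple: take the per-game FPS ratios between the two cards and average them. Below is a minimal sketch of that approach; the frame rates are made-up placeholders rather than our benchmark data, and the geometric mean is just one reasonable choice of average (it keeps a single outlier game from dominating the result):

```python
# Minimal sketch of deriving an "average relative performance" figure.
# Frame rates here are illustrative placeholders, not actual benchmark data.
from statistics import geometric_mean

fps_290x  = {"Battlefield 3": 62.0, "Crysis 3": 41.0, "Hitman": 55.0}
fps_titan = {"Battlefield 3": 64.0, "Crysis 3": 42.0, "Hitman": 53.0}

# Per-game ratios, then a geometric mean so no single game dominates.
ratios = [fps_290x[game] / fps_titan[game] for game in fps_290x]
print(f"290X vs. Titan: {geometric_mean(ratios):.0%}")
```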

Consequently, against NVIDIA's pricing structure the 290X is by every definition a steal at $549. Even if it were merely equal to the GTX 780 it would still be $100 cheaper; instead it's both faster and cheaper, something that has proven time and again to be a winning combination in this industry. The fact that it can even tie GTX Titan is mostly icing on the cake – for traditional gamers Titan hasn't made much sense since the GTX 780 came out – but it's nevertheless an important milestone for AMD, since it's a degree of parity they haven't achieved in years.

But with that said, although the 290X has a clear grip on performance and price, it comes at the cost of power and cooling. With GTX Titan and GTX 780, NVIDIA set the bar for power efficiency and cooling performance on a high-end card, and while that's not necessarily out of AMD's reach, it's the kind of thing that's only sustainable with high video card prices, which is not where AMD has decided to take the 290X. By focusing on high performance AMD has had to push quite a bit of power through the 290X, and by focusing on price they had to do so without blowing their budget on cooling. The end result is that the 290X is more power hungry than any comparable high-end card, and while AMD is able to effectively dissipate that much heat, the resulting cooling performance (as measured by noise) is at best mediocre. It's not so loud as to be intolerable in a single-GPU setup, but it's about as loud as can still be called reasonable, never mind preferable.

On that note, while this specific cooler implementation leaves room for improvement, the underlying technology has turned out rather well thanks to AMD's PowerTune improvements. Now that AMD has fine-grained control over GPU clockspeeds and voltages, along with the hardware necessary to monitor and control the full spectrum of power/temperature/noise, the door is open to more meaningful ways of adjusting the card and monitoring its status. Admittedly a lot of this is a retread of ground NVIDIA already covered with GPU Boost 2.0, but AMD's fan throttling idea in particular is a more intuitive method of controlling GPU noise than trying to operate by proxy via temperature and/or power.
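
To see why a fan speed target is the more intuitive knob, consider what the control loop is being asked to do: with the fan (and thus the noise) capped by the user, the card's only remaining degree of freedom is clockspeed. The following is a hypothetical, simulated sketch of that behavior; none of these functions correspond to AMD's actual driver or firmware interfaces, and the sensor reads are simulated so the loop can run stand-alone:

```python
# Hypothetical sketch of a PowerTune-style control loop with a user-set fan
# speed cap. These are stand-in functions, not a real AMD API.
import random

FAN_CAP_PCT   = 40    # user's noise proxy: the fan may never exceed 40%
TEMP_TARGET_C = 95    # throttle point
CLOCK_STEP    = 13    # MHz per adjustment, mimicking fine-grained DPM steps

def read_temp_c() -> float:
    return random.uniform(88.0, 97.0)   # simulated die temperature read

def read_fan_pct(clock_mhz: int) -> float:
    # Simulated fan duty cycle that scales with clockspeed (heat output),
    # but is hard-limited at the user's cap.
    return min(FAN_CAP_PCT, clock_mhz / 25.0)

def control_step(clock_mhz: int) -> int:
    temp = read_temp_c()
    fan = read_fan_pct(clock_mhz)
    if temp > TEMP_TARGET_C and fan >= FAN_CAP_PCT:
        # The fan is pegged at the noise limit, so shed heat by dropping
        # clockspeed rather than spinning the fan up further.
        return clock_mhz - CLOCK_STEP
    if temp < TEMP_TARGET_C - 3:
        return clock_mhz + CLOCK_STEP   # headroom available: restore clocks
    return clock_mhz

clock = 1000
for _ in range(10):
    clock = control_step(clock)
print(f"settled clockspeed: {clock} MHz")
```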

Meanwhile, 290X Crossfire performance also ended up being a most welcome surprise, thanks in large part to AMD's XDMA engine. The idea of exclusively using the PCI-Express bus for inter-GPU communication on a high-end video card was worrying at first given the latency inherent to PCIe, but to the credit of AMD's engineers, they have shown that it can work and that it works well. AMD is finally in a position where their multi-GPU frame pacing is up to snuff in all scenarios, and while there's still some room to further reduce overall variance, we're at the point where everything up to and including 4K is working well. AMD still faces a reckoning next month when they attempt to resolve the frame pacing issues on their existing products, but at the very least, going forward AMD has the hardware and the tools they need to keep the issue under control. Plus this does away with Crossfire bridges, which is a small but welcome improvement.
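
For context, frame pacing is about the consistency of consecutive frame times rather than the average frame rate: two setups can post identical average FPS while one of them stutters badly. A minimal sketch of the kind of metric involved, using made-up presentation timestamps rather than our capture data:

```python
# Sketch of a frame pacing metric over per-frame presentation timestamps
# (the sort of data an FCAT-style capture produces). Timestamps here are
# illustrative placeholders, not captured data.
from statistics import mean, stdev

timestamps_ms = [0.0, 16.8, 33.1, 50.9, 66.5, 83.9, 100.1, 116.6]

frame_times = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
swings = [abs(b - a) for a, b in zip(frame_times, frame_times[1:])]

avg = mean(frame_times)
print(f"average frame time: {avg:.1f} ms ({1000 / avg:.0f} fps)")
print(f"frame time stdev:   {stdev(frame_times):.2f} ms")
# Large frame-to-frame swings read as stutter even when the average is fine.
print(f"worst swing:        {max(swings):.2f} ms")
```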

Wrapping things up, it's looking like neither NVIDIA nor AMD is going to let today's launch set a new status quo. NVIDIA for their part has already announced the GTX 780 Ti for next month, and while we can only speculate on its performance, we certainly don't expect NVIDIA to let the 290X go unchallenged. The bigger question is whether they're willing to compete with AMD on price.

GTX Titan and its prosumer status aside, even with NVIDIA's upcoming game bundle, it's very hard right now to justify the GTX 780 over the cheaper 290X except on acoustic grounds. For some buyers that will be enough, but with 9% more performance for $100 less, there are certainly buyers who are going to shift their gaze to the 290X. For those buyers, NVIDIA can't afford to be both slower and more expensive than the 290X. Unless NVIDIA does something totally off the wall like discontinuing the GTX 780 entirely, they will have to bring prices down in response to the 290X's launch. The 290X is simply too disruptive to the GTX 780, and even the GTX 770 is going to feel the pinch between it and the 280X. Bundles will help, but what NVIDIA really needs to compete with the Radeon 200 series is a simple price cut.

Meanwhile AMD, for their part, would appear to have one more piece to play. Today we've seen the Big Kahuna, but retailers are already listing the R9 290, which based on AMD's new naming scheme would be their lower-tier Hawaii card. How it will pan out remains to be seen, but as a product clearly intended to fill the $250 gap between the 290X and 280X while also making Hawaii a bit more affordable, we certainly have high expectations for its performance. And if nothing else, we'd expect it to further ratchet up the pressure on NVIDIA.

Comments

  • DMCalloway - Thursday, October 24, 2013

    Once again, against the Titan it's $450 cheaper, not $100. Against the GTX 780 it is a wash on performance at a cheaper price point. Eight months late to the game, I'll agree; however, it took time to get in bed with Sony and Micro$oft, which was needed if they (AMD) ever hope to get to the point of being able to release 'at a competitive time'. I'm amazed that they are still viable after the financial losses they suffered from the whole business of Intel paying OEMs not to carry AMD's then-current CPU generation. Sure, AMD won the lawsuit, but the financial losses in market share were in the billions; Intel jumped ahead a gen and the damage was done. Realistically, I believe AMD chose wisely to focus on the console market, because the 7970 GHz pushed hard wasn't really that far behind a stock GTX 780.
  • Bloodcalibur - Thursday, October 24, 2013

    Ever wonder why the Titan costs $350 more than their own GTX 780 while having only a small margin of improvement?

    Oh, right, compute performance.
  • anubis44 - Thursday, October 24, 2013

    and in some cases, the R9 290X is as much as 23% faster at 4K resolution than the Titan, or in the words of HardOCP: "at Ultra HD 4K it (R9 290X) just owns the GeForce GTX TITAN."
  • Bloodcalibur - Thursday, October 24, 2013

    Once again, Titan is a gaming/workstation hybrid; that's why it costs $350 more than their own GTX 780 with only a small FPS improvement in gaming.
  • TheJian - Friday, October 25, 2013

    Depends on the games chosen. For instance, all at 4K:
    Guru3D:
    Tomb Raider: tied at 4K, 40fps (they consider this BARELY playable, though they advise 60fps)
    MOH Warfighter: Titan wins by 7%
    BioShock Infinite: Titan wins by 10% (33fps to 30, but again not going to be playable with minimums in the teens)
    BF3: TIE (32fps, again an average, so not playable)
    The only victory at 4K here is Hitman: Absolution. So clearly it depends on what your settings are and what games you play. Also note the fps at 4K at HardOCP. They can't max settings, and every game is a sacrifice of some stuff (or a lot). Even at 2560, Kyle notes all were unplayable with everything on, with averages at 22fps and minimums of 12fps for all 3 basically...ROFL. How useful is it to win (or even lose) at a res you can't play at?

    http://www.techpowerup.com/reviews/AMD/R9_290X/24....
    TechPowerUp tests all the way to 5760x1080; quoting that unless noted otherwise. Here we go again...LOL
    World of Warcraft: domination for the 780 & Titan (over 20% faster on Titan at 5760!)
    Skyrim: both Titan and the 780 win at 5760
    StarCraft II: only went to 2560, but again a clean sweep for the 780/Titan, both over 10%
    Splinter Cell: Blacklist: clean sweep at 2560 & 5760 for Titan AND the 780 (>20% for Titan at both res)
    Far Cry 3: Titan and the 780 win at 5760, but at 22fps who cares...LOL (still 10% faster than the 290X)
    Black Ops 2: only went to 2560, but Titan wins at all res
    Metro: TIE (26fps, again neither playable)
    Crysis 3: Titan wins by over 10% (25fps vs. 22, but neither playable...LOL)

    At HardOCP, Metro, Tomb Raider, BF3, and Crysis 3 were all UNDER 25fps minimum on both cards, with most coming in at 22fps or so on both. I wish they would benchmark at what they find is PLAYABLE, but even then I'm against 4K if I have to turn all kinds of stuff off in the first place. Only Far Cry 3 was tested at above 30fps...LOL. You need TWO cards for 4K gaming. PERIOD. If you have the money to buy a 4K monitor or two monitors, you probably have the cash to do it right and buy 2 cards. The Steam hardware survey shows this, as most people above 1920x1200 have 2 cards! Bragging about 4K gaming on this card (or even Titan) is ridiculous, as it just ends up in an exercise of turning off stuff that devs wanted me to SEE. I wouldn't care if the 290X was 50% faster than Titan; if you're running 22fps, who cares? Neither is playable. You've proven NOTHING. If we jump off a 100-story building I'll beat you to the bottom... Yeah, but umm... we're both still dead, right? So what's the point, no matter who wins that game?

    Fun fact: techspot.com Tomb Raider comment (2560 and 1080p both tested, 4xSSAA+16xAF):
    "We expected AMD to do better in Tomb Raider since they supported the title's development, but the R9 290X was 12% slower than the GTX Titan and 3% slower than the GTX 780"
    LOL. I hope they do better with the BF4 AMD enhancements. Resident Evil 6 shows a Titan win also.
    http://www.techspot.com/review/727-radeon-r9-290x/...

    Tom's Hardware 4K quote:
    "In Gaming At 3840x2160: Is Your PC Ready For A 4K Display?, I concluded that you’d want at least two GeForce GTX 780s for 4K. And although the R9 290X is faster than even the $1000 Titan, I maintain that you need a pair in order to crank your settings up to where they should be."
    That was their ARMA quote... but it applies to all of 4K... TWO CARDS. But their benchmarks are really low compared to everyone else's for Titan in the same games. It's like they took 10-15% off Titan's scores. E.g., BioShock Infinite at Guru3D shows Titan winning by 10%, but at Tom's losing by 20%, same game, same res... WTF? That's odd, right? Skyrim shows NV domination at 4K (the 780 also): almost 20% faster for Titan & the 780 (they tied) over Uber. Of course, they turned off ALL AA modes to get it playable. Again, you can't just judge 4K by one site's games. Clearly you can find the exact opposite at 4K, and coming back down to reality (a res you can actually play at above 30fps), Titan is smacking them in a ton of games (far more wins than losses). I could find a ton more if needed, but you should get the point. Titan isn't OWNED at 4K, and usually when it is, as Tom's says of Metro, "the win is largely symbolic though"; yeah, at 30fps average it is pointless even turned down!
  • bronopoly - Thursday, October 24, 2013

    Why shouldn't one of the cards you mentioned be bought for 1080p? I don't know about you, but I prefer to get 120 FPS in games so it matches my monitor w/ LightBoost enabled.
  • Bloodcalibur - Thursday, October 24, 2013

    Except the Titan is a gaming/workstation hybrid due to its compute ability. Anyone who bought a Titan just for gaming is retarded and paid $350 more than they would have for a 780. The Titan shouldn't be compared to the 290X for gaming. It's a good card for those who do both gaming and a little bit of computing.
  • looncraz - Thursday, October 24, 2013

    Install a new cooler and the last two of those problems vanish... and you've saved hundreds... you could afford to build a stand-alone water-cooling loop just for the 290X and still have money to spare for a nice dinner.
  • teiglin - Thursday, October 24, 2013

    I haven't finished reading the article yet, but isn't that more than a little hyperbolic? It just means NVIDIA will have to cut back on the amount it gouges for GK110. The fact that it was able to leave the price high for so long is nearly all good for them--it's just a matter of how quickly they adjust their pricing to match.

    It will be nice to have a fair fight again at the high-end for a single card.
  • bill5 - Thursday, October 24, 2013

    Heh, I'm the biggest AMD fanboy around, but these top two comments almost smell like marketing.

    It's a great card, and the Titan was deffo highly overpriced, but Nvidia can just make some adjustments on price and compete. That 780 Ti they showed will surely be something in that vein.
