
Final Words

Bringing this review to a close: going into this launch AMD was especially excited about the 290X, and it’s easy to see why. Traditionally AMD has not been able to compete with NVIDIA’s big flagship GPUs, and while that hasn’t stopped AMD from carving out a comfortable spot for themselves, it has meant that NVIDIA was left to their own devices at the top of the market. As such, while the sub-$500 market has been heavily competitive this entire generation, the same could not be said about the market over $500 until now. And although it is a niche of a niche in terms of volume, this segment is where the most powerful video cards reside, so fierce competition here not only brings down the price of flagship cards sooner, but in the process inevitably pushes prices down across the board. So seeing AMD performance-competitive with GTX Titan and GTX 780 with a single-GPU card of their own is absolutely a breath of fresh air.

Getting down to business then, AMD has clearly positioned the 290X as a price/performance monster, and while that’s not the be-all and end-all of evaluating video cards, it’s certainly going to be the biggest factor for most buyers. To that end, at 2560x1440 – what I expect will be the most common resolution used with such a card for the time being – AMD is essentially tied with GTX Titan, delivering an average of 99% of the performance of NVIDIA’s prosumer-level flagship. Against NVIDIA’s cheaper and more gaming-oriented GTX 780 that becomes an outright lead, with the 290X ahead by an average of 9% and never falling behind the GTX 780.
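As a concrete illustration of how such an average works out, the per-game performance ratios can simply be averaged. The frame rates below are made-up placeholder numbers chosen to land near a 99% average, not our measured benchmark results.

```python
# Hypothetical per-game frame rates (fps) at 2560x1440; the numbers
# are illustrative placeholders, not measured results.
r290x = {"BF3": 78.0, "Crysis 3": 45.0, "Metro: LL": 52.0, "BioShock": 88.0}
titan = {"BF3": 80.0, "Crysis 3": 46.0, "Metro: LL": 51.0, "BioShock": 90.0}

def average_relative_perf(card, baseline):
    """Mean of per-game performance ratios (card / baseline)."""
    ratios = [card[game] / baseline[game] for game in baseline]
    return sum(ratios) / len(ratios)

rel = average_relative_perf(r290x, titan)
print(f"290X delivers {rel:.0%} of Titan's performance on average")
```

With these placeholder numbers the 290X trails in some titles and leads in others, and the mean of the ratios comes out just under parity.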

Consequently against NVIDIA’s pricing structure the 290X is by every definition a steal at $549. Even if it were merely equal to the GTX 780 it would still be $100 cheaper, but instead it’s both faster and cheaper, something that has proven time and again to be a winning combination in this industry. Elsewhere the fact that it can even tie GTX Titan is mostly icing on the cake – for traditional gamers Titan hasn’t made a lot of sense since GTX 780 came out – but nevertheless it’s an important milestone for AMD since it’s a degree of parity they haven’t achieved in years.

But with that said, although the 290X has a clear grip on performance and price, it does come at the cost of power and cooling. With GTX Titan and GTX 780 NVIDIA set the bar for power efficiency and cooling performance on a high-end card, and while that’s not necessarily out of AMD’s reach, it’s the kind of thing that’s only sustainable with high video card prices, which is not where AMD has decided to take the 290X. By focusing on high performance AMD has had to push quite a bit of power through the 290X, and by focusing on price they had to do so without blowing their budget on cooling. The end result is that the 290X is more power-hungry than any comparable high-end card, and while AMD is able to effectively dissipate that much heat, the resulting cooling performance (as measured by noise) is at best mediocre. It’s not so loud as to be intolerable for a single-GPU setup, but it’s about as loud as can still be called reasonable, never mind preferable.

On that note, while this specific cooler implementation leaves room for improvement, the underlying technology has turned out rather well thanks to AMD’s PowerTune improvements. Now that AMD has fine-grained control over GPU clockspeeds and voltages, plus the hardware needed to monitor and control the full spectrum of power/temperature/noise, the door is open to more meaningful ways of adjusting the card and monitoring its status. Admittedly a lot of this is a retread of ground NVIDIA already covered with GPU Boost 2.0, but AMD’s idea of throttling against a maximum fan speed in particular is a more intuitive method of controlling GPU noise than trying to operate by proxy via temperature and/or power.
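That fan-speed-target behavior can be pictured as a simple control loop: spin the fan up to hold a temperature target, and only once the user’s fan ceiling is reached start shedding clockspeed instead. The sketch below is a rough illustration of the concept, not AMD’s actual firmware; the function name, step sizes, and thresholds are all invented.

```python
def throttle_step(clock_mhz, fan_pct, max_fan_pct, temp_c, target_temp_c,
                  clock_step=13, fan_step=5):
    """One iteration of a fan-limited boost controller (illustrative only).

    When hot, raise the fan first to hold the temperature target; once
    the user's fan ceiling is reached, shed clockspeed instead. When
    cool, quiet the fan back down.
    """
    if temp_c > target_temp_c:
        if fan_pct < max_fan_pct:
            fan_pct = min(max_fan_pct, fan_pct + fan_step)  # fan headroom left: spin up
        else:
            clock_mhz -= clock_step                         # fan capped: throttle clocks
    elif fan_pct > 20:
        fan_pct -= fan_step                                 # cool enough: reduce noise
    return clock_mhz, fan_pct

# Hot GPU with the fan already pinned at the user's 40% ceiling:
clock, fan = throttle_step(1000, 40, 40, 95, 94)
print(clock, fan)  # clocks drop to 987, fan stays capped at 40
```

The appeal of this scheme is exactly what the paragraph above describes: the user sets the one variable they actually care about (how loud the card gets), and the controller trades clockspeed to respect it, rather than the user guessing at temperature or power limits as a proxy for noise.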

Meanwhile 290X Crossfire performance also ended up being a very welcome surprise, thanks in large part to AMD’s XDMA engine. The idea of exclusively using the PCI-Express bus for inter-GPU communication on a high-end video card was worrying at first given the latency inherent to PCIe, but to the credit of AMD’s engineers they have shown that it can work and that it works well. AMD is finally in a position where their multi-GPU frame pacing is up to snuff in all scenarios, and while there’s still some room for improvement in further reducing overall variance, we’re at the point where everything up to and including 4K is working well. AMD still faces a reckoning next month when they attempt to resolve the frame pacing issues on their existing products, but at the very least, going forward AMD has the hardware and the tools they need to keep the issue under control. Plus this gets rid of Crossfire bridges, which is a small but welcome improvement.
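Frame pacing quality ultimately comes down to how evenly frames are delivered, i.e. the spread of frame-to-frame intervals rather than the average frame rate alone. A rough sketch of that measurement is below; the function name and traces are invented for illustration, and this is not FCAT’s or AMD’s exact metric.

```python
from statistics import mean, stdev

def frame_pacing_stats(timestamps_ms):
    """Frame-to-frame intervals and their spread, from present timestamps."""
    deltas = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    return mean(deltas), stdev(deltas)

# Two hypothetical ~60fps traces: one well paced, one micro-stuttering.
# Both average roughly 16ms per frame, but the second alternates short
# and long intervals, which is what frame pacing work targets.
smooth  = [i * 16.7 for i in range(10)]
stutter = [0, 8, 33, 41, 67, 75, 100, 108, 133, 141]

for name, trace in (("smooth", smooth), ("stutter", stutter)):
    avg, spread = frame_pacing_stats(trace)
    print(f"{name}: {avg:.1f} ms avg, {spread:.1f} ms stdev")
```

Both traces would report a similar FPS number, which is why variance in the intervals, not average frame rate, is the figure of merit when judging multi-GPU pacing.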

Wrapping things up, it’s looking like neither NVIDIA nor AMD are going to let today’s launch set a new status quo. NVIDIA for their part has already announced a GTX 780 Ti for next month, and while we can only speculate on performance we certainly don’t expect NVIDIA to let the 290X go unchallenged. The bigger question is whether they’re willing to compete with AMD on price.

GTX Titan and its prosumer status aside, even with NVIDIA’s upcoming game bundle it’s very hard right now to justify the GTX 780 over the cheaper 290X, except on acoustic grounds. For some buyers that will be enough, but with the 290X offering 9% more performance for $100 less, plenty of buyers are going to shift their gaze to AMD. For those buyers NVIDIA can’t afford to be both slower and more expensive than the 290X. Unless NVIDIA does something totally off the wall like discontinuing the GTX 780 entirely, they will have to bring prices down in response to the 290X’s launch. The 290X is simply too disruptive to the GTX 780, and even the GTX 770 is going to feel the pinch between it and the 280X. Bundles will help, but what NVIDIA really needs to compete with the Radeon 200 series is a simple price cut.

Meanwhile AMD for their part would appear to have one more piece to play. Today we’ve seen the Big Kahuna, but retailers are already listing the R9 290, which based on AMD’s new naming scheme would be AMD’s lower tier Hawaii card. How that will pan out remains to be seen, but as a product clearly intended to fill in the $250 gap between 290X and 280X while also making Hawaii a bit more affordable, we certainly have high expectations for its performance. And if nothing else we’d certainly expect it to further ratchet up the pressure on NVIDIA.

Comments

  • TheJian - Friday, October 25, 2013 - link

    Incorrect. Part of the point of G-Sync is that when you can do 200fps in a particular part of the game, they can crank up the detail and USE the power you have completely, rather than building the whole game around, say, 60fps. Then when all kinds of crap is happening on screen (50 guys shooting each other, etc.) they can drop the graphics detail down some to keep things smooth. G-Sync isn't JUST frame rate. You apparently didn't read the AnandTech live blog, eh? You get your cake and can eat it too (stutter free, no tearing, smooth, and extra power used when you have it available).
  • MADDER1 - Friday, October 25, 2013 - link

    If G-Sync drops the detail to maintain fps like you said, then you're really not getting the detail you thought you set. How is that having your cake and eating it too?
  • Cellar Door - Friday, October 25, 2013 - link

    How so? If Mantle gets GTX 760 performance in BF4 from a 260X... will you switch then?
  • Animalosity - Sunday, October 27, 2013 - link

    No. You are sadly mistaken, sir.
  • Antiflash - Thursday, October 24, 2013 - link

    I've usually preferred Nvidia cards, but they had it well deserved when they decided to price GK110 to the stratosphere just "because they can" while they had no competition. That's a poor way to treat your customers, taking advantage of fanboys. Full implementations of Tesla and Fermi were always priced around $500. Pricing Kepler GK110 at $650+ was stupid. It's silicon after all; you should get more performance for the same price each year, not more performance at a premium price as Nvidia tried to do this generation. AMD is not doing anything extraordinary here, they are just not following Nvidia's price gouging practices, and $550 puts their flagship GPU at its historical market price. We would not be having this discussion if Nvidia had done the same with GK110.
  • inighthawki - Thursday, October 24, 2013 - link

    " It's silicon after all, you should get more performance for the same price each year"

    So R&D costs come from where, exactly? Not sure why people always forget that there is actual R&D that goes into these types of products; it's not just some $5 of plastic and silicon plus some labor and manufacturing costs. Like when they break down phones and tablets and calculate costs, they never account for this. As if their engineers are basically just selecting components on Newegg and plugging them together.
  • jecastejon - Thursday, October 24, 2013 - link

    R&D? Is R&D tied only to a specific Nvidia card? AMD, like others, also invests a lot in R&D with every product generation, even when they are not successful. Nvidia will have to do a reality check with their pricing and be loyal to their fans and the market. Today's advantages don't last for too long.
  • Antiflash - Thursday, October 24, 2013 - link

    NVIDIA's logic, Kepler refresh: 30% more performance => 100% increase in price.
    AMD's logic, GCN refresh: 30% more performance => 0% increase in price.
    I can't see how this is justified by R&D; it's just a greedy company squeezing its most loyal customer base.
  • Antiflash - Thursday, October 24, 2013 - link

    Just for clarification: the price comparison is between cards at introduction, comparing NVIDIA's 680 vs. Titan and AMD's 7970 vs. 290X.
  • TheJian - Friday, October 25, 2013 - link

    AMD's way = ZERO profits, company going broke, debt high, 6Bil in losses over 10yrs.
    NV's way = 500-800mil profits per year, so you can keep your drivers working.

    Your love of AMD's pricing is dumb. They are broke. They have sold nearly everything they had (fabs, land); all that is left is the company itself and its IP.

    AMD should have priced this card at $650, period. Also note, NV hasn't made as much money as in 2007 for 6 years. They are not gouging you, or they would be making MORE than they did in 2007, right? Intel, Apple, and MS are gouging you, as they all make more now than then (2007 was pretty much the high point for a lot of companies, down since). People like you make me want to vomit, as you are just KILLING AMD, which in turn will eventually cost me dearly buying NV cards, as they will be the only ones with real power in the next few years. AMD already gave up the cpu race. How much longer do you think they can fund the gpu race with no profits? 200mil is owed to GF on Dec 31, so the meager profit they made last Q and any they might make next Q is GONE. They won't make 200mil profit next Q... LOL. Thanks to people like you asking for LOW pricing and free games.

    You don't even understand that you are ANTI-AMD... LOL. Your crap logic is killing them (and making NV 300mil less profit than in 2007). The war is hurting them both. I'd rather have AMD making 500mil and NV making 1B than what we get now: AMD at ZERO and NV at 550mil.
