A Note On Crossfire, 4K Compatibility, Power, & The Test

Before we dive into our formal testing, there are a few brief testing notes that bear mentioning.

First and foremost, on top of our normal testing we did some additional Crossfire compatibility testing to see if AMD’s new XDMA Crossfire implementation ran into any artifacting or other issues that we didn’t experience elsewhere. The good news is that outside of the typical scenarios where games simply don’t scale with AFR – something that affects SLI and CF equally – we didn’t see any artifacts in the games themselves. The closest we came to a problem was with the intro videos for Total War: Rome 2, which exhibit black horizontal lines due to the cards trying to AFR render those videos at a higher framerate than they play back at. Once in-game, Rome was relatively fine; relatively, because it’s one of the games in our suite that doesn’t see any performance benefit from AFR.
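
For readers unfamiliar with the term, AFR (alternate frame rendering) splits the workload by handing successive frames to alternating GPUs. Below is a toy sketch – purely illustrative, not vendor code – of the scheduling involved; as the comments note, content locked to a fixed playback rate, like a pre-rendered intro video, has nothing to gain from the second GPU.

```python
# A toy sketch (not vendor code) of alternate frame rendering (AFR):
# successive frames are handed to GPUs round-robin. Content that is locked
# to a fixed playback rate -- like a pre-rendered intro video -- gains
# nothing from the extra GPU, since frames can't be delivered any faster.

def afr_schedule(num_frames, num_gpus=2):
    """Map each frame index to the GPU that renders it."""
    return {frame: frame % num_gpus for frame in range(num_frames)}

print(afr_schedule(6, num_gpus=2))
# {0: 0, 1: 1, 2: 0, 3: 1, 4: 0, 5: 1}
```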

Unfortunately AMD’s drivers for the 290X are a bit raw when it comes to Crossfire. Of note, when running at a 4K resolution we had a few instances where loading a game triggered an immediate system reboot. We’ve had crashes before, but nothing quite like this. After we reported it, AMD told us that they’ve been able to reproduce the issue and have fixed it for the 290X launch drivers, which will be newer than the press drivers we used. Once those drivers are released we’ll be checking to confirm the fix, but we have no reason to doubt AMD at this time.

Speaking of 4K, due to the two-controller nature of the PQ321 monitor we use, there are some teething issues related to using 4K right now. Most games are fine at 4K; however, we have found games that both NVIDIA and AMD have trouble with at one point or another. On the NVIDIA side, Metro will occasionally lock up after switching resolutions, and on the AMD side, GRID 2 will immediately crash when using the two-controller (4K@60Hz) setup. In the latter case, dropping down to a single controller (4K@30Hz) satisfies GRID while still allowing us to test at 4K resolutions – with V-sync off it doesn’t have a performance impact versus 60Hz – but it is something AMD and Codemasters will need to fix.

Furthermore, we also wanted to offer a quick update on the state of Crossfire on AMD’s existing bridge-based (non-XDMA) cards. The launch drivers for the 290X do not contain any further Crossfire improvements for bridge-based cards, which means Eyefinity Crossfire frame pacing is still broken for all APIs. Of particular note for our testing, the 280X Crossfire setup ends up in a particularly nasty failure mode, simply dropping every other frame. Each frame is being rendered – as evidenced by the consumption of the Present call – however, as our FCAT testing shows, it’s apparently not making it to the master card. This has the humorous outcome of making the frame times look rather smooth, but it makes Crossfire all but worthless, as the additional frames are never displayed. Hopefully AMD can put a fork in the matter once and for all next month.
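
To illustrate what this failure mode looks like in the data, below is a minimal sketch – simplified well past what real FCAT analysis involves, and using hypothetical frame IDs rather than our actual tooling – of how comparing the frames that consume a Present call against the frames recovered from the captured display output would flag a setup that renders everything but displays only every other frame.

```python
# A minimal sketch (hypothetical data, not our actual FCAT tooling) of how
# comparing Presented frames against frames recovered from the captured
# display output exposes the 280X CF failure mode: everything is rendered
# and Presented, but only every other frame reaches the screen.

def find_dropped_frames(presented_ids, displayed_ids):
    """Return IDs of frames that were Presented but never shown on screen."""
    shown = set(displayed_ids)
    return [f for f in presented_ids if f not in shown]

# Frames 0..9 all consume a Present call, but the capture only ever sees
# the master card's frames -- every other frame is silently dropped.
presented = list(range(10))
displayed = list(range(0, 10, 2))

dropped = find_dropped_frames(presented, displayed)
print(f"Dropped {len(dropped)} of {len(presented)} frames: {dropped}")
# Frame times measured at the display still look smooth (a steady cadence
# of one card's frames), yet half the rendering work never reaches the user.
```

In the 280X’s case those dropped frames negate the benefit of the second card entirely, which is why the smooth frame times are cold comfort.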

A Note On Testing Methodologies & Sustained Performance

Moving on to the matter of our testing methodology, we want to make note of some changes since our 280X review earlier this month. After initially settling on Metro: Last Light for our gaming power/temp/noise benchmark, in a spot of poor planning on our part we discovered that Metro scales poorly on SLI/CF setups, and as a result doesn’t push those setups very hard. As such we have switched from Metro to Crysis 3 for our power/temp/noise benchmarking; Crysis 3 was our second choice, offering a similar degree of run-to-run consistency as Metro while scaling very nicely across both AMD and NVIDIA multi-GPU setups. For single-GPU cards the impact on noise is minor, as the workloads are similar, however power consumption will be a bit different due to the difference in CPU workloads between the two benchmarks.

We also want to make quick note of our testing methodologies and how they are or are not impacted by temperature-based throttling. For years we have done all of our GPU benchmarking by looping gaming benchmarks multiple times, both to combat the inherent run-to-run variation that we see in benchmarking, and more recently to serve as a warm-up activity for cards with temperature-based throttling. While these methods have proven sufficient for the Radeon 7000 series, the GeForce 600 series, and even the GeForce 700 series, simple physics dictate that AMD’s 95C throttle point takes longer to reach than NVIDIA’s 80C throttle point. As a result it’s harder to bring the 290X up to its sustained temperatures before the end of our benchmark runs. It will inevitably hit 95C in quiet mode, but not every benchmark runs long enough to reach that point before the 3rd or 4th loop.

For the sake of consistency with past results we have not altered our benchmark methodology. However, we wanted to be sure to point this out before getting to the benchmarks, so that there’s no confusion over how we’re handling the matter. We believe our looping benchmarks run long enough to generally reach sustained performance numbers, but in all likelihood some of our numbers on the shortest benchmarks will skew high, as those runs end before the card fully heats up and throttles. For the next iteration of our benchmark suite we will most likely need to institute a pre-heating phase for all cards to account for AMD’s 95C throttle point.
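
To make the idea concrete, below is a minimal sketch of what such a pre-heating phase might look like. It drives a simulated card rather than real benchmark tooling, and every name and number in it is hypothetical; the point is simply that warm-up loops go unscored until the card reaches its steady-state temperature, after which the scored loops are recorded.

```python
# A minimal sketch of a pre-heated benchmark loop. FakeGPU is a stand-in
# for a real card (all numbers hypothetical); a real harness would query
# the GPU's thermal sensor and launch the actual game benchmark.

class FakeGPU:
    """Simulated card: temperature creeps toward a ceiling as it works."""
    def __init__(self, ceiling_c=95.0):
        self.temp_c = 40.0
        self.ceiling_c = ceiling_c

    def run_benchmark_loop(self):
        # Each loop closes part of the gap to the thermal ceiling, and the
        # returned FPS sags as the card heats up and throttles.
        self.temp_c += (self.ceiling_c - self.temp_c) * 0.35
        return 60.0 - (self.temp_c - 40.0) * 0.15

def benchmark_sustained(gpu, throttle_c=95.0, loops=4, max_warmup_loops=20):
    # Pre-heat phase: run the workload unscored until the card sits at
    # (or asymptotically near) its throttle point, i.e. steady state.
    warmups = 0
    while gpu.temp_c < throttle_c - 0.5 and warmups < max_warmup_loops:
        gpu.run_benchmark_loop()
        warmups += 1
    # Scored phase: only these loops count toward the reported result.
    scores = [gpu.run_benchmark_loop() for _ in range(loops)]
    return sum(scores) / len(scores)

gpu = FakeGPU()
avg = benchmark_sustained(gpu)
print(f"Sustained average: {avg:.1f} FPS at {gpu.temp_c:.0f}C")
```

The warm-up cap matters: a card that never reaches its throttle point is already delivering sustained performance, so the harness shouldn’t spin forever waiting for 95C.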

The Drivers

The press drivers for the 290X are Catalyst 13.11 Beta v5 (the “v” is AMD’s nomenclature), which identify themselves as being from driver branch 13.250. These are technically still in the 200 branch of AMD’s drivers, but this is the first appearance of 250, as Catalyst 13.11 Beta v1 was still 13.200. AMD doesn’t offer release notes for these beta drivers, but we found that they offer distinct improvements in GRID 2 and, to a lesser extent, Battlefield 3, and we have updated our earlier results accordingly.

Meanwhile for NVIDIA we’re using the recently released “game ready” 331.58 WHQL drivers.

CPU: Intel Core i7-4960X @ 4.2GHz
Motherboard: ASRock Fatal1ty X79 Professional
Power Supply: Corsair AX1200i
Hard Disk: Samsung SSD 840 EVO (750GB)
Memory: G.Skill RipjawZ DDR3-1866 4 x 8GB (9-10-9-26)
Case: NZXT Phantom 630 Windowed Edition
Monitor: ASUS PQ321
Video Cards: AMD Radeon R9 290X
             XFX Radeon R9 280X Double Dissipation
             AMD Radeon HD 7970 GHz Edition
             AMD Radeon HD 7970
             AMD Radeon HD 6970
             AMD Radeon HD 5870
             NVIDIA GeForce GTX Titan
             NVIDIA GeForce GTX 780
             NVIDIA GeForce GTX 770
Video Drivers: NVIDIA Release 331.58
               AMD Catalyst 13.11 Beta v1
               AMD Catalyst 13.11 Beta v5
OS: Windows 8.1 Pro

Comments

  • Sandcat - Thursday, October 24, 2013 - link

    Perhaps they knew it was unsustainable from the beginning, but short-term gains are generally what motivate managers when they develop pricing strategies, because bonus. Make hay whilst the sun shines, or when AMD is 8 months late.
  • chizow - Saturday, October 26, 2013 - link

    Possibly, but now they have to deal with the damaged goodwill of some of their most enthusiastic, spendy customers. I can't count how many times I've seen it, someone saying they swore off company X or company Y because they felt they got burned/screwed/fleeced by a single transaction. That is what Nvidia will be dealing with going forward with Titan early adopters.
  • Sancus - Thursday, October 24, 2013 - link

    AMD really needs to do better than a response 8 months later to crash anyone's parade. And honestly, I would love to see them put up a fight with Maxwell within a reasonable time period so that Nvidia has an incentive to keep prices lower. Otherwise, expect Nvidia to "overprice" things next generation as well.

    When they have no competition for 8 months it's not unsustainable to price as high as the market will bear, and there's no real evidence that Titan was economically overpriced because it's not like there was a supply glut of Titans sitting around anywhere, in fact they were often out of stock. So really, Nvidia is just pricing according to the market -- no competition from AMD for 8 months, fastest card with limited supply, why WOULD they price it at anything below $1000?
  • chizow - Saturday, October 26, 2013 - link

    My reply would be that they've never had to price it at $1000 before, and we have certainly seen this level of advancement from one generation to the next in the past (7900GTX to 8800GTX, 8800GTX to GTX 280, GTX 280 to GTX 480, etc), so it's not completely ground-breaking performance even though Kepler overall outperformed historical improvements by ~20%, imo.

    Also, the concern with Titan isn't just the fact it was priced at ungodly premiums this time around, it's the fact it held its crown for such a relatively short period of time. Sure, Nvidia had no competition at the $500+ range for 8 months, but that was also the full extent of Titan's reign at the top. In the past, a flagship in that $500 or $600+ range would generally reign for the entire generation, especially one that was launched halfway through that generation's life cycle. Now Nvidia has already announced a reply with the 780 Ti, which will mean not one but TWO cards will surpass Titan at a fraction of its price before the generation goes EOL.

    Nvidia was clearly blind-sided by Hawaii and ultimately it will cost them customer loyalty, imo.
  • ZeDestructor - Thursday, October 24, 2013 - link

    $1000 cards are fine, since the Titan is a cheap compute unit compared to the Quadro K6000 and the 690 is a dual-GPU card (Dual-GPU has always been in the $800+ range).

    What we should see is the 780 (Ti?) go down in price and match the R9-290x, much to the rejoicing of all!

    Nvidia got away with $650-750 on the 780 because they could, and THAT is why competition is important, and why I pay attention to AMD even if I have no reason to buy from them over Nvidia (driver support on Linux is a joke). Now they have to match. Much of the same happens in the CPU segment.
  • chizow - Saturday, October 26, 2013 - link

    For those that actually bought the Titan as a cheap compute card, sure Titan may have been a good buy, but I doubt most Titan buyers were buying it for compute. It was marketed as a gaming card with supercomputer guts and at the time, there was still much uncertainty whether or not Nvidia would release a GTX gaming card based on GK110.

    I think Nvidia preyed on these fears and took the opportunity to launch a $1K part, but I knew it was an unsustainable business model for them because it was predicated on Nvidia being an entire ASIC ahead of AMD, able to match AMD's fastest ASIC (Tahiti) with their 2nd fastest (GK104). Clearly Hawaii has turned that idea on its head and Nvidia's premium product stack is crashing down in flames.

    Now, we will see at least 4 cards (290/290X, 780/780 Ti) that all come close to or exceed Titan's performance at a fraction of the price, only 8 months after its launch. Short reign indeed.
  • TheJian - Friday, October 25, 2013 - link

    The market dictates pricing. As they said, they sell every Titan immediately, so they could probably charge more. But that's because it has more value than you seem to understand. It is a PRO CARD at its core. Are you unaware of what a TESLA is for $2500? It's the same freaking card with 1 more SMX and driver support. $1000 is GENEROUS whether you like it or not. Gamers with PRO intentions laughed when they saw the $1000 price and have been buying them like mad ever since. No parade has been crashed. They will continue with this pricing model for the foreseeable future, as they have proven there is a market for high-end gamers with a PRO APP desire on top. The first run was 100,000 units and sold out in days. By contrast, the Asus ROG Ares 2 had a 1000-unit first run and didn't sell out like that. At $1500 it really was a ripoff with no PRO side.

    I think they'll merely need another SMX turned on and 50-100MHz more for the next $1000 version, which likely comes before xmas :) The PRO perf is what is valued here over a regular card. Your short-lived statement makes no sense. It's been 8 months, a rather long life in GPUs, when you haven't beaten the 8-month-old card in much (I debunked the 4K crap already, and pointed to a dozen other games where Titan wins at every res). You won't fire up Blender, Premiere, PS CS etc and smoke a Titan with the 290X either...LOL. You'll find out what the other $450 is for at that point.
  • chizow - Saturday, October 26, 2013 - link

    Yes and as soon as they released the 780, the market corrected itself and Titans were no longer sold out anywhere, clearly a shift indicating the price of the 780 was really what the market was willing to bear.

    Also, there are more differences with their Tesla counterparts than just 1 SMX, Titan lacks ECC support which makes it an unlikely candidate for serious compute projects. Titan is good for hobby compute, anything serious business or research related is going to spend the extra for Tesla and ECC.

    And no, 8 months is not a long time at the top; look at the reigns of previous high-end parts and you will see it is generally longer than this. Even the 580 that preceded it held sway for 14 months before Tahiti took over its spot. Time at the top is just one part though; the amount by which Titan devalued is the bigger concern. When the 780 launched 3 months after Titan, you could maybe sell a Titan for $800. Now that Hawaii has launched, you could maybe sell it for $700? It's only going to keep going down; what do you think it will sell for once the 780 Ti beats it outright for $650 or less?
  • Sandcat - Thursday, October 24, 2013 - link

    I noticed your comments on the Tahiti pricing fiasco 2 years ago and generally skip through the comment section to find yours because they're top notch. Exactly what I was thinking with the $550 price point, finally a top-tier card at the right price for 28nm. Long live sanity.
  • chizow - Saturday, October 26, 2013 - link

    Thanks! Glad you appreciated the comments, I figured this business model and pricing for Nvidia would be unsustainable, but I thought it wouldn't fall apart until we saw 20nm Maxwell/Pirate Islands parts in 2014. Hawaii definitely accelerated the downfall of Titan and Nvidia's $1K eagle's nest.
