
A Note On Crossfire, 4K Compatibility, Power, & The Test

Before we dive into our formal testing, there are a few brief testing notes that bear mentioning.

First and foremost, on top of our normal testing we did some additional Crossfire compatibility testing to see whether AMD’s new XDMA Crossfire implementation ran into any artifacting or other issues that we didn’t experience elsewhere. The good news is that outside of the typical scenarios where games simply don’t scale with AFR – something that affects SLI and CF equally – we didn’t see any artifacts in the games themselves. The closest we came to a problem was with the intro videos for Total War: Rome 2, which showed black horizontal lines because the cards were trying to AFR-render the video at a higher framerate than it played back at. Once in-game, Rome was relatively fine; relatively, because it’s one of the games we have that sees no performance benefit from AFR.
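Whether a title actually benefits from AFR can be summarized as a scaling efficiency: the multi-GPU frame rate divided by the ideal N-times single-GPU frame rate. A minimal sketch of that calculation (the frame rates here are hypothetical illustrations, not our benchmark results):

```python
def afr_scaling_efficiency(fps_single: float, fps_dual: float, gpus: int = 2) -> float:
    """Fraction of ideal alternate-frame-rendering (AFR) scaling achieved.

    1.0 means perfect scaling across all GPUs; a value near 1/gpus means
    the extra card(s) contribute essentially nothing.
    """
    return fps_dual / (fps_single * gpus)

# Hypothetical frame rates: one title that scales well under AFR,
# and one (like Rome 2 in our testing) that does not.
good = afr_scaling_efficiency(fps_single=45.0, fps_dual=84.0)  # ~0.93
flat = afr_scaling_efficiency(fps_single=45.0, fps_dual=46.0)  # ~0.51
```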

Unfortunately, AMD’s drivers for the 290X are a bit raw when it comes to Crossfire. Of note, when running at a 4K resolution we had a few instances where loading a game triggered an immediate system reboot. We’ve had crashes before, but nothing quite like this. After we reported the issue, AMD told us they had been able to reproduce it and have fixed it for the 290X launch drivers, which will be newer than the press drivers we used. Once those drivers are released we’ll be checking to confirm, but we have no reason to doubt AMD at this time.

Speaking of 4K, due to the two-controller nature of the PQ321 monitor we use, there are some teething issues with 4K right now. Most games are fine at 4K; however, we have found games that both NVIDIA and AMD have trouble with at one point or another. On the NVIDIA side, Metro will occasionally lock up after switching resolutions, and on the AMD side, GRID 2 will immediately crash when using the two-controller (4K@60Hz) setup. In the latter case, dropping down to a single controller (4K@30Hz) satisfies GRID while still allowing us to test at 4K resolutions, and with V-sync off it doesn’t have a performance impact versus 60Hz, but it is something AMD and Codemasters will need to fix.

Furthermore, we also wanted to offer a quick update on the state of Crossfire on AMD’s existing bridge-based (non-XDMA) cards. The launch drivers for the 290X do not contain any further Crossfire improvements for bridge-based cards, which means Eyefinity Crossfire frame pacing is still broken for all APIs. Of particular note for our testing, the 280X Crossfire setup ends up in a particularly nasty failure mode, simply dropping every other frame. Each frame is being rendered, as evidenced by the consumption of the Present call, but as our FCAT testing shows it’s apparently not making it to the master card. This has the humorous outcome of making the frame times rather smooth, but it makes Crossfire all but worthless, as the additional frames are never displayed. Hopefully AMD can put a fork in the matter once and for all next month.
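This failure mode is straightforward to spot in FCAT-style capture data: the Present call is consumed for every frame, yet alternating frames spend zero time on screen. A minimal sketch of that check (the input format and numbers are hypothetical; real FCAT analysis extracts per-frame display time from overlay-color scan lines in a captured video):

```python
def alternating_drop_ratio(display_times_ms):
    """Return the worst-case fraction of dropped frames (0 ms on screen)
    among even-indexed and odd-indexed frames.

    A ratio near 1.0 for one parity matches the every-other-frame
    failure mode: frames are Presented but never scanned out.
    """
    if not display_times_ms:
        return 0.0
    even = display_times_ms[0::2]
    odd = display_times_ms[1::2]
    ratios = [sum(1 for t in group if t == 0.0) / len(group)
              for group in (even, odd) if group]
    return max(ratios)

# Hypothetical capture: every other frame never reaches the display,
# so frame times look smooth even though half the frames are lost.
times = [16.6, 0.0] * 8
print(alternating_drop_ratio(times))  # 1.0
```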

A Note On Testing Methodologies & Sustained Performance

Moving on to the matter of our testing methodology, we want to make note of some changes since our 280X review earlier this month. After initially settling on Metro: Last Light for our gaming power/temp/noise benchmark, in a spot of poor planning on our part we discovered that Metro scales poorly on SLI/CF setups, and as a result doesn't push those setups very hard. As such, we have switched from Metro to Crysis 3 for our power/temp/noise benchmarking: Crysis 3 was our second choice, offers a similar degree of run-to-run consistency, and scales very nicely across both AMD and NVIDIA multi-GPU setups. For single-GPU cards the impact on noise is minor, as the workloads are similar, but power consumption will be a bit different due to the difference in CPU workloads between the two benchmarks.

We also want to make quick note of our testing methodologies and how they are or are not impacted by temperature-based throttling. For years we have done all of our GPU benchmarking by looping gaming benchmarks multiple times, both to combat the inherent run-to-run variation we see in benchmarking and, more recently, to serve as a warm-up for cards with temperature-based throttling. While these methods have proven sufficient for the Radeon 7000 series, the GeForce 600 series, and even the GeForce 700 series, simple physics dictates that AMD's 95C throttle point takes longer to reach than NVIDIA's 80C throttle point. As a result it's harder to bring the 290X up to its sustained temperatures before the end of our benchmark runs. It will inevitably hit 95C in quiet mode, but not every benchmark runs long enough to reach that point before the 3rd or 4th loop.

For the sake of consistency with past results we have not altered our benchmark methodology. However, we wanted to point this out before getting to the benchmarks so that there’s no confusion over how we’re handling the matter. We believe our looping benchmarks run long enough to generally reach sustained performance numbers, but in all likelihood some of our numbers on the shortest benchmarks will skew high, as a card that has not yet reached its throttle point is still running at elevated clockspeeds. For the next iteration of our benchmark suite we will most likely need to institute a pre-heating phase for all cards to counter AMD’s 95C throttle point.
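Such a pre-heating phase can be as simple as running, and then discarding, a fixed number of warm-up loops before recording results. A minimal sketch of the idea (the harness and frame rates are hypothetical, not our actual test scripts):

```python
import statistics

def sustained_result(run_once, loops=4, warmup_loops=2):
    """Loop a benchmark, discarding warm-up passes so the reported
    average reflects sustained (post-throttle) performance.

    run_once: callable returning the average FPS for one pass.
    """
    results = []
    for i in range(warmup_loops + loops):
        fps = run_once()
        if i >= warmup_loops:  # keep only passes after the card heats up
            results.append(fps)
    return statistics.mean(results)

# Hypothetical card: boosts high on early passes, then settles once it
# reaches its 95C throttle point.
passes = iter([98.0, 92.0, 88.0, 88.5, 87.5, 88.0])
print(sustained_result(lambda: next(passes)))  # 88.0
```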

The Drivers

The press drivers for the 290X are Catalyst 13.11 Beta v5 (The “v” is AMD’s nomenclature), which identify themselves as being from the driver branch 13.250. These are technically still in the 200 branch of AMD’s drivers, but this is the first appearance of 250, as Catalyst 13.11 Beta v1 was still 13.200. AMD doesn’t offer release notes on these beta drivers, but we found that they offered distinct improvements in GRID 2 and to a lesser extent Battlefield 3, and have updated our earlier results accordingly.

Meanwhile for NVIDIA we’re using the recently released “game ready” 331.58 WHQL drivers.

CPU: Intel Core i7-4960X @ 4.2GHz
Motherboard: ASRock Fatal1ty X79 Professional
Power Supply: Corsair AX1200i
Hard Disk: Samsung SSD 840 EVO (750GB)
Memory: G.Skill RipjawZ DDR3-1866 4 x 8GB (9-10-9-26)
Case: NZXT Phantom 630 Windowed Edition
Monitor: Asus PQ321
Video Cards: AMD Radeon R9 290X
XFX Radeon R9 280X Double Dissipation
AMD Radeon HD 7970 GHz Edition
AMD Radeon HD 7970
AMD Radeon HD 6970
AMD Radeon HD 5870
NVIDIA GeForce GTX Titan
NVIDIA GeForce GTX 780
NVIDIA GeForce GTX 770
Video Drivers: NVIDIA Release 331.58
AMD Catalyst 13.11 Beta v1
AMD Catalyst 13.11 Beta v5
OS: Windows 8.1 Pro

 


396 Comments


  • SolMiester - Monday, October 28, 2013 - link

    So you can OC a 780 on stock, but not the 290x to sustain the OC, which means 780 wins!, especially after the price drop to $500!, oh dear AMD 290x just went from hero to zero...
  • TheJian - Friday, October 25, 2013 - link

    I gave links and named the games previously...See my post. At 1080p 780 trades blows depending on the games. Considering 98.75% of us are 1920x1200 or less, that is important and you get 3 AAA games with 780, on top of the fact that it's using far less watts, less noise and less heat. A simple drop in price of $50-100 and 780 seems like a no brainer to me (disregarding the 780TI which should keep the same price as now I'd guess). Granted Titan needs a dunk in price now too, which I'm sure will come or they'll just replace it with a full SMX up-clocked titan to keep that price. I'm guessing old titan just died as 780TI will likely beat it in nearly everything if the rumored clock speed and extra smx are true. They will have to release a new titan ULTRA or something with another smx or up the mhz to 1ghz or something. OR hopefully BOTH.

    I'm guessing it's easier to just up the 100mhz or put it to 1ghz as surely manufacturing has gotten them to where all will do this now, more than having all SMX's defect free. Then again if you have a bad SMX just turn a few more off and it's a 780TI anyway. They've had 8 months to either pile up cherry picked ones, or just improve totally anyway so more can do this easily. Clearly 780ti was just waiting in the wings already. They were just waiting to see 290x perf and estimates.
  • eddieveenstra - Sunday, October 27, 2013 - link

    Titan died when 780gtx entered the room at 600 Euro. I'm betting Nvidia only brings a 780gtx ti and that's it. Titan goes EOL.
  • anubis44 - Thursday, October 24, 2013 - link

    This is the reference card. It's not loud unless you set it to 'Uber' mode, and even then, HardOCP thought the max fan speed should be set to 100% rather than 55%. Imagine how quiet an Asus DirectCU II or Gigabyte Windforce or Sapphire Toxic custom cooled R9 290X will be.

    Crossfire and frame pacing all working, and R9 290X crushes Titan in 4K gaming (read HardOCP's review of this 4K section), all while costing $100 less than GTX780, and the R9 280X (7970) is priced at $299, and the R9 270X (7870) is now going for $180, and now Mantle API could be the next 3dfx Glide, and boost all 7000-series cards and higher dramatically for free...

    It's like AMD just pulled out a light sabre and cut nVidia right in half while Jsen Hsun just stares dumbly at them in disbelief. He should have merged nVidia with AMD when he had the chance. Could be too late now.
    Reply
  • Shark321 - Thursday, October 24, 2013 - link

    There will be no custom cooling solution for the time being. It's the loudest card ever released. Twice as loud as the 780/Titan in BF3 after 10 minutes of playing. Also, Nvidia will bring the 780 Ti in 3 weeks, a faster card at a comparable price, but quiet. AMD releases the 290X one year after Nvidia, 2 years after Nvidia's tapeout. Nvidia will be able to counter this with a wink.
  • just4U - Thursday, October 24, 2013 - link

    Shark Writes: "It's the loudest card ever released."

    Guess you weren't around for the Geforce5...
  • HisDivineOrder - Thursday, October 24, 2013 - link

    The FX5800 is not ever dead. Not if we remember the shrill sound of its fans...

    ...or if the sound burned itself into our brains for all time.
  • Samus - Friday, October 25, 2013 - link

    I think the 65nm GeForce 280 takes the cake for loudest card ever made. It was the first card with a blower.
  • ninjaquick - Thursday, October 24, 2013 - link

    lol, the Ti can only do so much, there is no smaller node for either company to jump to, not until March for enough shipments to have stock for sales. The 290X just proves AMD's GCN design is a keeper. It is getting massively throttled by heat and still manages to pull a slight lead over the titan, at sometimes 15% lower clocks than reference. AMD needed a brand for this release season, and they have it.

    Both Nvidia and AMD are jumping to the next node in 2014. Nvidia will not release Maxwell on the current node. And there is no other node they would invest in going to.
  • HisDivineOrder - Thursday, October 24, 2013 - link

    The Ti could theoretically open up all the disabled parts of the current GK110 part. Doing that, who knows what might happen? We've yet to see a fully enabled GK110. I suspect that might eat away some of the Titan's efficiency advantage, though.
