Crysis 3

Still one of our most punishing benchmarks, Crysis 3 needs no introduction. With Crysis 3, Crytek went back to trying to kill computers, and the game still holds the "most punishing shooter" title in our benchmark suite. Only a handful of setups can even run Crysis 3 at its highest (Very High) settings, and that's still without AA. Crysis 1 was an excellent template for the kind of performance required to drive games for the next few years, and Crysis 3 looks to be much the same for 2014.

Crysis 3 - 3840x2160 - Low Quality + FXAA

Crysis 3 - 2560x1440 - High Quality + FXAA

Crysis 3 - 1920x1080 - High Quality + FXAA

Crysis 3 is another game where the outcome between the R9 290XU and the GTX 970 depends on the resolution. At 4K Low the GTX 970 trails the R9 290XU by 10%, only for the two to come within a frame of each other at 1440p High, and past that, at 1080p, it's all NVIDIA. This once again neatly illustrates that AMD still holds a general resolution scaling advantage over NVIDIA and the Maxwell 2 architecture. But since we're looking at a $329 card that's cheaper than any 4K monitor, that's not an advantage that's going to be of much value in the real world.

155 Comments

  • hammer256 - Saturday, September 27, 2014 - link

    It would not surprise me if GM204 is crippled in FP64 in a similar way to GK104, with a physically limited number of FP64 cores.
    Regarding GK110, it isn't known how dies are selected between the FP64-crippled gamer cards and the professional cards. You can imagine a case where dies with defects in the FP64 cores can still be used in gamer cards, improving yields a bit. But that's pure speculation, of course.
    Either way, Nvidia does this because it makes them more money, and they can get away with it. If you remember your class in micro-economics, when an industry is in a state of monopoly or oligopoly, segmentation is the way to go for profit maximization. Unless AMD is willing to not segment its products, there is no pressure on Nvidia to change what they are doing.
    So we can argue that consumers are the losers in this state of things, and generally in a monopoly or oligopoly that is indeed the case. But in this specific case with FP64, I have to ask: are there many (or any) consumer-relevant applications that could really benefit from FP64? I'm curious to know. I would say that for these companies to care, the application needs to have general relevance on the same order of magnitude as graphics. (For a concrete picture of what breaks in single precision, see the small float-versus-double sketch after this comment thread.)
    Those of us who use GPUs for scientific computation such as simulations are the real losers in this trend. But then again, we were fortunate to have had this kind of cheap, off-the-shelf hardware that was so powerful for what we do. Looks like that ride is coming to an end, at least for the foreseeable future. Personally, my simulation doesn't really benefit from double precision, so I'm pretty lucky. Even then, I found that stepping from a GTX 580 to a GTX 680 didn't improve performance at all. The silver lining was that the GTX 690 had much better performance than the GTX 590 for me, and I was able to get 4 GTX 690s for some excellent performance. A GTX 990 would be tempting, or maybe just wait for the 20nm iteration...
  • anubis44 - Wednesday, October 22, 2014 - link

    Of course GM204 is crippled in FP64. That's where nVidia is finding the improved power budget and the reduction in wattage requirements. Frankly, I think it's pretty cheesy, and I've stopped listening to people creaming their jeans about how fabulous nVidia's low power is compared with AMD's. Of course it's going to lower its power requirements if you cripple the hell out of it. Duh. The question is whether you will line up to get shafted with all the other drones, or whether you'll protest this stupidity by buying AMD instead and give nVidia the finger, as they rightly deserve. If we don't, AMD will have to take the FP64 circuitry out of its cards to compete.
  • D. Lister - Sunday, September 28, 2014 - link

    What I said earlier had nothing to do with efficiency. If you were a prosumer in the market for double-precision hardware, why would you want a $3000 pro GPU when you could get nearly the same performance from a <$1000 consumer variant? Not everyone cares about ECC VRAM. HPC guys et al. would be all over it, resulting in an unfairly inflated retail price for the rest of us. When that happens, Nvidia is the one that gets the bad rep, just like AMD did during the bitcoin mining fad. Why do you believe it is so important anyway?
  • Subyman - Friday, September 26, 2014 - link

    Looking at the PCB, the FTW version does not have more VRM phases than the SC or regular EVGA model. I only see four chokes, which is what the other cards have; MSI has six. I'm wondering if EVGA is also using the same low-end analog VRMs here that the SC and regular EVGA cards use. All the other 970s use higher-end VRMs.
  • wetwareinterface - Saturday, September 27, 2014 - link

    The FTW is not the top-end designation. It isn't even better than the SC cards in most cases; it's lower clocked than the SC and just has extra RAM.

    For EVGA, the custom-clocked cards are, in order:

    SC
    FTW
    SSC
    SC Signature
    Classified

    Again, the FTW can have lower clocks than the SC, or the same clocks, but it usually has more RAM.
  • Subyman - Saturday, September 27, 2014 - link

    I never said it was. The article mentioned it had 1 more power phase than the others, but from the pictures it obviously doesn't.
  • Subyman - Friday, September 26, 2014 - link

    Also, we really need a roundup of all the brands on here. Seeing the FTW version vs. the reference card doesn't paint a usable picture for those looking to make a purchase.
  • Mr Perfect - Friday, September 26, 2014 - link

    Is anyone going to pair this with the 980's blower? That would be quite impressive.

    Oh, and get the 970's IO up to par. Again, the 980's configuration would be better. Dual DVI indeed...
  • Margalus - Friday, September 26, 2014 - link

    PNY has a 970 with the full complement of outputs: 3 DisplayPort, 1 HDMI 2.0, and 1 DVI. It really pisses me off that most of the top-tier makers like EVGA and ASUS decided to switch that to 1 DisplayPort, 1 HDMI, and 2 DVI...
  • pixelstuff - Friday, September 26, 2014 - link

    Same here. Annoyed.
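
As a footnote to hammer256's FP64 question above, here is a minimal C sketch (an illustration added here, not from the article or any commenter) of the kind of accumulation error that pushes simulation work toward double precision: a float accumulator stops changing once it reaches 2^24, while a double accumulator stays exact for the same workload.

    /* Illustration: single- vs. double-precision accumulation.
     * A float carries a 24-bit significand, so adding 1.0f is absorbed once
     * the running sum reaches 16,777,216 (2^24); a double stays exact here. */
    #include <stdio.h>

    int main(void)
    {
        const long n = 100000000;   /* 1e8 additions */
        float  sum_f = 0.0f;
        double sum_d = 0.0;

        for (long i = 0; i < n; i++) {
            sum_f += 1.0f;          /* stalls at 16777216.0 */
            sum_d += 1.0;           /* reaches 100000000.0 */
        }

        printf("float  accumulator: %.1f\n", sum_f);
        printf("double accumulator: %.1f\n", sum_d);
        return 0;
    }

Built with any C compiler (e.g. gcc sum.c && ./a.out), the float result comes out more than 80% low, which is why workloads that accumulate many small contributions over long runs can care about FP64 throughput even when graphics does not.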
