Crysis 3

Still one of our most punishing benchmarks three years later, Crysis 3 needs no introduction. Crytek's DX11 masterpiece remains demanding enough at its Very High settings to humble even the best video cards, never mind the rest. Along with its high performance requirements, Crysis 3 is a rather balanced game in terms of power consumption and vendor optimizations. As a result, it gives us a good look at how our video cards stack up on average, and, later in this article, at how power consumption plays out.

[Benchmark chart: Crysis 3 - 2560x1440 - Very High Quality + FXAA]

[Benchmark chart: Crysis 3 - 1920x1080 - Very High Quality + FXAA]

129 Comments

  • CiccioB - Thursday, April 20, 2017 - link

    Yes, better in the few DX12 games optimized for AMD's architecture, where it gains 10% at most... a real selling point, right up until DX12 games without ad-hoc AMD optimization are released and many users wake up from their wet dreams.
  • Outlander_04 - Thursday, April 20, 2017 - link

    It's not optimization, it's asynchronous compute. The nVidia architecture can't do it and will never be able to keep up in DX12.
  • tipoo - Thursday, April 20, 2017 - link

    Define "can't do it". Pascal does async compute, just not with the per-clock interleaving that AMD does.
  • Outlander_04 - Thursday, April 20, 2017 - link

    Then it is not asynchronous, which quite literally means "at the same time".
    AMD's compute strength is well established by the legions of people who wisely use their cards for bitcoin mining.
  • CiccioB - Friday, April 21, 2017 - link

    Async doesn't really mean "at the same time" at all.
    If anything, it means the opposite.
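The exchange above turns on what "asynchronous" actually promises, so a small illustration may help. Below is a minimal, hypothetical D3D12 sketch (not from the article or any commenter; the CreateQueues helper and the variable names are made up) of what "async compute" looks like at the API level: a second, compute-only command queue the application can submit to without waiting on the graphics queue. Whether the GPU then runs the two streams concurrently is a hardware/driver decision, which is the per-clock interleaving distinction tipoo draws above.

```cpp
// Hypothetical sketch: "asynchronous" describes the submission model, not a
// guarantee of simultaneous execution. Error handling is omitted for brevity.
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

void CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& graphicsQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue)
{
    // The usual "direct" queue: accepts graphics, compute, and copy work.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&graphicsQueue));

    // A separate compute-only queue. Work submitted here is independent of
    // the graphics queue unless the application synchronizes with fences.
    D3D12_COMMAND_QUEUE_DESC compDesc = {};
    compDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    device->CreateCommandQueue(&compDesc, IID_PPV_ARGS(&computeQueue));
}

// Per frame, both calls below return immediately; how (or whether) the two
// streams of work overlap on the GPU is up to the hardware scheduler:
//
//   graphicsQueue->ExecuteCommandLists(1, &gfxCommandList);
//   computeQueue->ExecuteCommandLists(1, &computeCommandList);
```

DX11 exposes no equivalent second queue to the application, which is why the DX11 vs. DX12 comparison keeps coming up later in this thread.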
  • CiccioB - Thursday, April 20, 2017 - link

    No optimizations?
    Then tell me why DICE's engine runs better on AMD GPUs even in DX11, while every other engine does not.
    Async in DX11? A miracle that suddenly let AMD's drivers pass nvidia's in draw calls? Better geometry handling? Better memory and bandwidth handling?
    Come on. You AMD fanboys are all looking at the first (pseudo) DX12 games, sponsored by AMD. Future ones will be different (maybe even using nvidia features that AMD does not support, and not biased toward AMD HW... AMD surely can't support every AAA developer in doing the extra work for async, which is not a free feature, did you know?, and tune it for every card), and by the time DX12 becomes mainstream Volta will be old.
    But it's nice that you all go around telling people to buy AMD HW. It should make nvidia's cheaper... in theory, at least... though you can't be advertising it very hard, since prices keep sitting at the same high level. Please recommend CrossFire setups too, so that AMD sells double the HW and all those new AMD customers can enjoy double the performance in... ermm... well... you know, DX12 does not support CF/SLI natively, so they'll happily play DX11 games at nvidia-level performance with their CF configurations.

    I bet you heard that async line from an AMD friend... didn't you?
  • Outlander_04 - Thursday, April 20, 2017 - link

    Why is DX11 game optimization in various game engines [which could favor either AMD or nvidia] of any relevance to me pointing out the strengths of AMD's architecture in DX12?

    Please try and address what is said, not what you want to think was said. Thanks
  • CiccioB - Friday, April 21, 2017 - link

    It's you who is looking only at what you want.
    There are two scenarios to analyze: DX11 and DX12.
    You just pick DX12 and ignore DX11, because DX12 is what you want to advertise, and you base your conclusions only on what you want to see.
    I simply pointed out that in DX11 the game is clearly well optimized for AMD's architecture, given the performance it achieves there; relative to nvidia, no other DX11 game has ever reached that level.
    So you can't dismiss the simple and clear assertion that it is an AMD-optimized game (engine).
    It is, and DX11 demonstrates it. What you see in DX12 is what things would look like if ALL future games were optimized for AMD's architecture this way. Which won't happen. Other games (also supporting DX12) show that they can run better on nvidia HW, both because they lack all that work paid for by AMD to make the game run better on AMD HW, and because not all games take advantage of async compute (which has a development cost, did you understand this, or are you living in your own world of bunnies and rainbows?).

    So concluding that AMD does well in DX12 just by looking at one engine built to run better on their HW (a fact that, as I said, is also visible in DX11) is stupid and just demonstrates a pure lie.
  • Mugur - Thursday, April 20, 2017 - link

    I'm sorry to be another one pointing out that the testbed is obsolete (the best approach would be two testbeds, an i7 7700K and an R7 1800X or R5 1600X) and that it's missing a few newer games (Doom, Battlefield 1, etc.).

    As for the cards: they're ok-ish, in my opinion. Nothing spectacular, but it's still a refresh at the same price or a bit lower than last year, and both are cool and quiet even factory overclocked. Nobody should care about a few watts more than the 1060 (which was actually warmer and noisier in these tests), as long as they have a decent PSU.

    As an owner of two FreeSync monitors, I may go for a 580 8 GB to replace my 470, which would then go into the kid's PC. After I see Vega, of course. :-)
  • CiccioB - Thursday, April 20, 2017 - link

    "Few watts"
    It uses double the power for the same work!
    And yes, a bit warmer and noisier.. it was the FE with the blower solution. Take a custom card, it will be still faster than this OC over OC sh*t and with use half the power and be much more cool with less than half the noise.

    It is fascinating to try to understand how people can justify certain incomprehensible choices.
