Crysis 3

Our final benchmark in our suite needs no introduction. With Crysis 3, Crytek has gone back to trying to kill computers, taking back the “most punishing game” title in our benchmark suite. Only in a handful of setups can we even run Crysis 3 at its highest (Very High) settings, and that’s still without AA. Crysis 1 was an excellent template for the kind of performance required to drive games for the next few years, and Crysis 3 looks to be much the same for 2013.

Unsurprisingly, Crysis 3 is another game where 2560 isn’t really on the table. In fact we have to go all the way to 1920 at High settings to get a framerate above 60fps. By this point the GTX 770 leads over the 7970GE by 16%, a smaller 7% over the GTX 680, and 76% over the GTX 570.

Comments

  • raghu78 - Thursday, May 30, 2013

    Look at most of the reviews. Across a wide range of games you will see these two cards are tied.

    http://www.hardwarecanucks.com/forum/hardware-canu...

    http://www.computerbase.de/artikel/grafikkarten/20...

    http://www.pcgameshardware.de/Geforce-GTX-770-Graf...

    http://www.hardware.fr/articles/896-22/recapitulat...
  • bitstorm - Thursday, May 30, 2013

    It seems to match up with other reviews I have seen. Maybe you are looking at ones that are not using the reference card? The non-reference reviews show it doing a bit better.

    Still, even with the better results of the non-reference cards, it is a bit of a disappointing release from Nvidia IMO. While it is good that it will likely cause AMD to drop the price of the 7970 GE, it won't light a fire under AMD to make an impressive jump on their next lineup refresh.
  • Brainling - Thursday, May 30, 2013

    And if you look at any AMD review, you'll see fanbois jumping out of the wood work to accuse Anand and crew of being Nvidia homers. You can't win for losing I guess.
  • kallogan - Thursday, May 30, 2013

    barely beats 680 at higher power consumption. Turbo boost is useless. Useless gpu. Next.
  • gobaers - Thursday, May 30, 2013

    There are no bad products, only bad prices. If you want to think of this as a 680 with a price cut and modest bump, where is the harm in that?
  • EJS1980 - Thursday, May 30, 2013

    Exactly!
  • B3an - Thursday, May 30, 2013

    I'm glad you mentioned the 2GB VRAM issue, Ryan. Because it WILL be a problem soon.

    In the comments for the 780 review I was saying that even 3GB VRAM will probably not be enough for the next 18 months to 2 years, at least for people who game at 2560x1600 and higher (maybe even 1080p with enough AA). As usual many short-sighted idiots didn't agree, when it should be amazingly obvious there's going to be a big VRAM usage jump when these new consoles arrive and their games start getting ported to PC. They will easily be going over 2GB.

    I definitely wouldn't buy the 770 with 2GB. It's not enough, and I've had problems with high-end cards running out of VRAM in the past when the 360/PS3 launched. It will happen again with 2GB cards. And it's really not a nice experience when it happens (single-digit FPS), and totally unacceptable for hardware this expensive.
  • TheinsanegamerN - Monday, July 29, 2013

    People have been saying that for a long time. I heard the same thing when I bought my 550 Tis. And, 2 years later... only Battlefield 3 pushed past the 1GB frame buffer at 1080p, and that was on unplayable settings (everything maxed out). Now, if I lower the settings to maintain at least 30fps, no problems. 700MB usage max, maybe 750 on a huge map. Now, at 1440p, I can see this being a problem for 2GB, but I think 3GB will be just fine for a long time.
  • just4U - Thursday, May 30, 2013

    I don't quite understand why Nvidia's partners wouldn't go with the reference design of the 770. I've been keenly interested in those nice high quality coolers and hoping they'd make their way into the $400 parts. It's a great selling point (I think) and disappointing to know that they won't be using them.
  • chizow - Thursday, May 30, 2013

    I agree, it feels like false advertising or bait and switch given GPU Boost 2.0 relies greatly on operating temps and throttling once you hit 80C.

    Seems a bit irresponsible for Nvidia to send out cards like this and for reviewers to subsequently review and publish the results.
