The 2GB Question & The Test

Before diving into our test results, I wanted to spend a moment mulling over NVIDIA's choice of default memory configuration on the GTX 770. Due to GK104's 256-bit memory bus, NVIDIA's practical memory choices are limited to either 2GB or 4GB of RAM. A year ago 2GB was fine even if it wasn't as large as AMD's 3GB memory pool, but that was, after all, a year ago.
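To put rough numbers to that constraint, here's a quick back-of-the-envelope sketch. The eight-chip configuration follows from GDDR5's standard 32-bit per-device interface, and the chip densities are the 2Gb/4Gb parts common at the time rather than anything NVIDIA has detailed:

```python
# Back-of-the-envelope sketch of why a 256-bit bus means 2GB or 4GB.
# GDDR5 devices expose a 32-bit interface, so a 256-bit bus is normally
# populated with 8 chips (the densities here are assumptions, not specs).
BUS_WIDTH_BITS = 256
CHIP_INTERFACE_BITS = 32
CHIP_DENSITIES_GBIT = [2, 4]  # 2Gb parts are standard today; 4Gb is emerging

chips = BUS_WIDTH_BITS // CHIP_INTERFACE_BITS  # 8 devices
for density in CHIP_DENSITIES_GBIT:
    capacity_gbytes = chips * density / 8  # gigabits -> gigabytes
    print(f"{chips} x {density}Gb chips -> {capacity_gbytes:.0f}GB")
# 8 x 2Gb -> 2GB; 8 x 4Gb -> 4GB. Anything in between would require
# mixed chip densities, which brings its own performance quirks.
```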

Much as with the 1GB/2GB split on mainstream ($150+) cards, we're at a similar precipice with these enthusiast-class cards. Having 2GB of RAM doesn't impose any real problems today, but I'm left to wonder how much longer that's going to be true. The wildcard in all of this will be the next-generation consoles, each of which packs 8GB of RAM; even after everything else is accounted for, that is quite a lot of memory available for video operations. With most PC games being ports of console games, there's a decent risk of 2GB cards coming up undersized at high resolutions with the highest quality art assets. The worst-case scenario is only that those highest quality assets may not be usable at playable performance, but considering how strong every other aspect of the GTX 770 is, that would be a distinct and unfortunate bottleneck.
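As a rough illustration of how quickly the budget fills up at these resolutions, the sketch below estimates render-target memory alone; the target count, MSAA level, and per-pixel sizes are illustrative assumptions on my part, not measurements from any particular game:

```python
# Illustrative estimate of render-target memory vs. resolution (assumed
# numbers, not measured data). Textures and other assets come on top.
BYTES_PER_PIXEL = 4  # e.g. RGBA8; a depth/stencil buffer is similar

def render_target_mb(width, height, msaa=1, targets=1):
    """Memory for `targets` MSAA'd render targets, in MB."""
    return width * height * msaa * targets * BYTES_PER_PIXEL / 2**20

for w, h in [(1920, 1080), (2560, 1600)]:
    # assume a deferred renderer: ~4 G-buffer targets plus depth
    mb = render_target_mb(w, h, msaa=4, targets=5)
    print(f"{w}x{h}, 4x MSAA, 5 targets: ~{mb:.0f}MB")
# ~158MB at 1920x1080 and ~312MB at 2560x1600. Render targets alone can
# eat a few hundred MB, leaving well under 2GB for textures -- which is
# exactly where console-quality art assets are likely to blow the budget.
```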

The solution, for better or worse, is doubling the GTX 770's memory to 4GB. The GTX 770 is capable of housing 4GB, and NVIDIA's partners will be selling 4GB cards in the near future, so 4GB cards will at least be an option. The price premium for the 4GB configuration looks to be around $20-$30, and I expect that to come down some as 4Gb chips start to replace 2Gb chips. 4GB would certainly make the GTX 770 future-proof in that respect, and I suspect it's a good idea for anyone on a long upgrade cycle, but as always this is a bit of a gamble.

Still, I can't help but feel NVIDIA could have simply sidestepped the whole issue by making 4GB the default rather than an optional upgrade. As it stands, 2GB feels shortsighted, and for a $400 card, a bit small. Given the low cost of the additional RAM, a 4GB baseline likely would have been bearable.

The Test

For today's launch article we're using NVIDIA's 320.18 drivers for the GTX 780 and GTX 770, NVIDIA's 320.14 drivers for our other NVIDIA cards, and AMD's Catalyst 13.5b2 drivers for all AMD cards.

CPU: Intel Core i7-3960X @ 4.3GHz
Motherboard: EVGA X79 SLI
Power Supply: Antec True Power Quattro 1200
Hard Disk: Samsung 470 (256GB)
Memory: G.Skill Ripjaws DDR3-1867 4 x 4GB (8-10-9-26)
Case: Thermaltake Spedo Advance
Monitor: Samsung 305T
Video Cards: AMD Radeon HD 7970 GHz Edition
AMD Radeon HD 7990
NVIDIA GeForce GTX 580
NVIDIA GeForce GTX 680
NVIDIA GeForce GTX 690
NVIDIA GeForce GTX 770
NVIDIA GeForce GTX 780
NVIDIA GeForce GTX Titan
Video Drivers: NVIDIA ForceWare 320.14
NVIDIA ForceWare 320.18
AMD Catalyst 13.5 Beta 2
OS: Windows 8 Pro
Comments

  • raghu78 - Thursday, May 30, 2013 - link

    Look at what most of the reviews show. Across a wide range of games you will see these two cards are tied.

    http://www.hardwarecanucks.com/forum/hardware-canu...

    http://www.computerbase.de/artikel/grafikkarten/20...

    http://www.pcgameshardware.de/Geforce-GTX-770-Graf...

    http://www.hardware.fr/articles/896-22/recapitulat...
  • bitstorm - Thursday, May 30, 2013 - link

    It seems to match up with other reviews I have seen. Maybe you are looking at ones that are not using the reference card? The non-reference reviews show it doing a bit better.

    Still, even with the better results of the non-reference cards, it is a bit of a disappointing release from Nvidia IMO. While it is good that it will likely cause AMD to drop the price of the 7970 GE, it won't set a fire under AMD to make an impressive jump on their next lineup refresh.
  • Brainling - Thursday, May 30, 2013 - link

    And if you look at any AMD review, you'll see fanbois jumping out of the woodwork to accuse Anand and crew of being Nvidia homers. You can't win for losing, I guess.
  • kallogan - Thursday, May 30, 2013 - link

    Barely beats the 680 at higher power consumption. Turbo boost is useless. Useless GPU. Next.
  • gobaers - Thursday, May 30, 2013 - link

    There are no bad products, only bad prices. If you want to think of this as a 680 with a price cut and a modest bump, where is the harm in that?
  • EJS1980 - Thursday, May 30, 2013 - link

    Exactly!
  • B3an - Thursday, May 30, 2013 - link

    I'm glad you mentioned the 2GB VRAM issue, Ryan. Because it WILL be a problem soon.

    In the comments for the 780 review I was saying that even 3GB VRAM will probably not be enough for the next 18 months to 2 years, at least for people who game at 2560x1600 and higher (maybe even 1080p with enough AA). As usual many short-sighted idiots didn't agree, when it should be amazingly obvious there's going to be a big VRAM usage jump when these new consoles arrive and their games start getting ported to PC. They will easily be going over 2GB.

    I definitely wouldn't buy the 770 with 2GB. It's not enough, and I've had problems with high-end cards running out of VRAM in the past when the 360/PS3 launched. It will happen again with 2GB cards. And it's really not a nice experience when it happens (single-digit FPS) and totally unacceptable for hardware this expensive.
  • TheinsanegamerN - Monday, July 29, 2013 - link

    People have been saying that for a long time. I heard the same thing when I bought my 550 Tis. And, 2 years later... only Battlefield 3 pushed past the 1GB frame buffer at 1080p, and that was on unplayable settings (everything maxed out). Now, if I lower the settings to maintain at least 30fps, no problems: 700MB usage max, maybe 750 on a huge map. Now, at 1440p, I can see this being a problem for 2GB, but I think 3GB will be just fine for a long time.
  • just4U - Thursday, May 30, 2013 - link

    I don't quite understand why Nvidia's partners wouldn't go with the reference design of the 770. I've been keenly interested in those nice high-quality coolers and was hoping they'd make their way into the $400 parts. It's a great selling point (I think) and it's disappointing to know that they won't be using them.
  • chizow - Thursday, May 30, 2013 - link

    I agree, it feels like false advertising or a bait-and-switch, given GPU Boost 2.0 relies heavily on operating temps and throttles once you hit 80C.

    Seems a bit irresponsible for Nvidia to send out cards like this and for reviewers to subsequently review and publish the results.
