The 2GB Question & The Test

Before diving into our test results, I wanted to spend a moment mulling over NVIDIA’s choice of default memory configuration on GTX 770. Due to the use of a 256-bit bus on GK104, NVIDIA’s practical memory choices are limited to either 2GB or 4GB of RAM. A year ago 2GB was fine, even if it wasn’t as large as AMD’s 3GB memory pool, but that was, after all, a year ago.

Not unlike the 1GB/2GB situation on mainstream ($150+) cards, we’re at a similar crossroads with these enthusiast-class cards. Having 2GB of RAM doesn’t impose any real problems today, but I’m left to wonder how much longer that’s going to be true. The wildcard in all of this will be the next-generation consoles, each of which packs 8GB of RAM, which is quite a lot of RAM for video operations even after everything else is accounted for. With most PC games being ports of console games, there’s a decent risk of 2GB cards being undersized when used with high resolutions and the highest quality art assets. The worst case scenario is merely that those highest quality assets won’t be usable at playable performance, but considering how strong every other aspect of GTX 770 is, that would be a distinct and unfortunate bottleneck.

The solution, for better or worse, is doubling the GTX 770’s memory to 4GB. GTX 770 is capable of housing 4GB, and NVIDIA’s partners will be selling 4GB cards in the near future, so 4GB cards will at least be an option. The price premium for 4GB of RAM looks to be around $20-$30, and I expect that will come down some as 4Gb chips start to replace 2Gb chips. 4GB would certainly make the GTX 770 future-proof in that respect, and I suspect it’s a good idea for anyone on a long upgrade cycle, but as always this is a bit of a gamble.

Still, I can’t help but feel NVIDIA could have simply sidestepped the whole issue by making 4GB the default rather than an optional upgrade. As it stands, 2GB feels shortsighted, and for a $400 card, a bit small. Given the low cost of the additional RAM, a 4GB baseline likely would have been bearable.

The Test

For today’s launch article we’re using NVIDIA’s 320.18 drivers for the GTX 780 and GTX 770, and AMD’s Catalyst 13.5b2 drivers for all AMD cards.

CPU: Intel Core i7-3960X @ 4.3GHz
Motherboard: EVGA X79 SLI
Power Supply: Antec True Power Quattro 1200
Hard Disk: Samsung 470 (256GB)
Memory: G.Skill Ripjaws DDR3-1867 4 x 4GB (8-10-9-26)
Case: Thermaltake Spedo Advance
Monitor: Samsung 305T
Video Cards: AMD Radeon HD 7970 GHz Edition
AMD Radeon HD 7990
NVIDIA GeForce GTX 580
NVIDIA GeForce GTX 680
NVIDIA GeForce GTX 690
NVIDIA GeForce GTX 770
NVIDIA GeForce GTX 780
NVIDIA GeForce GTX Titan
Video Drivers: NVIDIA ForceWare 320.14
NVIDIA ForceWare 320.18
AMD Catalyst 13.5 Beta 2
OS: Windows 8 Pro

117 Comments


  • chizow - Thursday, May 30, 2013 - link

    They are both overpriced relative to their historical cost/pricing; as a result, Nvidia posted record margins last quarter and will probably do similarly well again.
  • Razorbak86 - Thursday, May 30, 2013 - link

    Cool! I'm both a customer and a shareholder, but my shares are worth a hell of a lot more than my SLi cards. :)
  • antef - Thursday, May 30, 2013 - link

    I'm not happy that NVIDIA threw power efficiency to the wind this generation. What is with these GPU manufacturers that they can't seem to CONSISTENTLY focus on power efficiency? It's always "Oh don't worry, next gen will be better, we promise," then it finally does get better, then the next gen sucks, then again it's "don't worry, next gen we'll get power consumption down, we mean it this time." How about CONTINUING to focus on it? Imagine any other product segment where a 35% power increase would be considered acceptable; there is none. That makes a 10 FPS (or whatever) jump not impressive in the slightest. I have a 660 Ti, which I feel has an amazing speed to power efficiency ratio; looks like this generation definitely needs to be sat out.
  • jwcalla - Thursday, May 30, 2013 - link

    It's going to be hard to get a performance increase without sacrificing some power while using the same architecture. You pretty much need a new architecture to get both.
  • jasonelmore - Thursday, May 30, 2013 - link

    or a die shrink
  • Blibbax - Thursday, May 30, 2013 - link

    As these cards have configurable TDP, you get to choose your own priorities.
  • coldpower27 - Thursday, May 30, 2013 - link

    There isn't much you can really do when you're working with the same process node and the same architecture; the best you can hope for is a slight bump in efficiency at the same performance level, but if you push performance past the sweet spot, you sacrifice efficiency.

    In past generations you had half-node shrinks: GTX 280 -> GTX 285 went from 65nm to 55nm, and hence reduced power consumption.

    Now we don't; we have jumped straight from 55nm -> 40nm -> 28nm, with the next 20nm node still a ways out. There just isn't very much you can do right now for performance.
  • JDG1980 - Thursday, May 30, 2013 - link

    Yes, this is really TSMC's fault. They've been sitting on their ass for too long.
  • tynopik - Thursday, May 30, 2013 - link

    maybe a shade of NVIDIA green for the 770 in the charts instead of AMD red?
  • joel4565 - Thursday, May 30, 2013 - link

    Looks like an interesting part, if for no other reason than to put pressure on AMD's 7950 GHz card. I imagine that card will be dropping to 400ish very soon.

    I am not sure what card to pick up this summer. I want to buy my first 2560x1440 monitor (leaning towards the Dell 2713hm) this summer, but that means I need a new video card too, as my AMD 6950 is not going to have the muscle for 1440p. It looks like both the Nvidia 770 and AMD 7950 GHz are borderline for 1440p depending on the game, but there is a big price jump to go to the Nvidia 780.

    I am also not a huge fan of crossfire/sli, although I do have a compatible motherboard. Also, to preempt the 2560/1440 vs 2560/1600 debate: yes, I would of course prefer more pixels, but most of the 2560x1600 monitors I have seen are wide gamut, which I don't need, and cost $300-400 more. 160 vertical pixels are not worth $300-400 and dealing with wide gamut issues for programs that aren't compatible.
