The 2GB Question & The Test

If it seems like 1GB+ video cards have been common for ages now, you wouldn’t be too far off. 1GB cards effectively became mainstream in 2008 with the release of the 1GB Radeon HD 4870, and in 2010 the Radeon HD 6970 kicked off the shift that saw 2GB cards push out 1GB cards as the common capacity for high-end cards. Since then 2GB cards have been trickling down AMD and NVIDIA’s product stacks while at the same time iGPUs have been making the bottoms of those stacks redundant.

For this generation AMD decided to make their cutoff the 7800 series earlier this year; the 7700 series would be 1GB by default, while the 7800 series and above would be 2GB or more. AMD has since introduced the 7850 1GB as a niche product (in large part to combat the GTX 650 Ti), but the 7850 is still predominantly a 2GB card. For NVIDIA on the other hand the line is being drawn between the GTX 660 and GTX 650; the GTX 660 is entirely 2GB, while the GTX 650 and GTX 650 Ti are predominantly 1GB cards with some 2GB cards mixed in as a luxury option.

The reason we bring this up is because while this is very clear from a video card family perspective, it doesn’t really address performance expectations. Simply put, at what point does a 2GB card become appropriate? When AMD or NVIDIA move a whole product line the decision is made for you, but when you’re looking at a split product like the GTX 650 Ti or the 7850 then the decision is up to the buyer and it’s not always an easy decision.

To try to help with that decision, we’ve broken down the performance of several games on both cards with both 1GB and 2GB models, listing the performance of 2GB cards relative to 1GB cards. By looking for performance advantages, we can hopefully better quantify the benefits of a 2GB card.
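As a rough illustration of how such a comparison is tabulated, relative performance is simply the ratio of each game's average framerate on the 2GB card to its framerate on the 1GB card. This is a minimal sketch; the FPS figures below are hypothetical placeholders, not our measured results:

```python
# Sketch of tabulating 2GB-vs-1GB relative performance per game.
# All framerate values here are illustrative placeholders.

fps_1gb = {
    "Skyrim (1920, HD texture pack)": 30.0,
    "Battlefield 3": 45.0,
}
fps_2gb = {
    "Skyrim (1920, HD texture pack)": 45.0,
    "Battlefield 3": 45.5,
}

for game, base in fps_1gb.items():
    # Relative performance: 2GB framerate as a percentage of the 1GB framerate.
    relative = fps_2gb[game] / base * 100
    print(f"{game}: {relative:.0f}%")
```

A result near 100% means the extra memory is going unused at that game's settings; a large gap, as in the Skyrim placeholder above, is the signature of a VRAM bottleneck on the 1GB card.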

Regardless of whether we’re looking at AMD or NVIDIA cards, there’s only one benchmark where 2GB cards have a clear lead: Skyrim at 1920 with the high resolution texture pack. For our other 9 games the performance difference is minuscule at best.

But despite the open-and-shut nature of our data we’re not ready to put our weight behind these results. Among other issues, our benchmark suite is approaching a year old now, which means it doesn’t reflect some of the major games released in the past few months such as XCOM: Enemy Unknown, Borderlands 2, and DiRT: Showdown, or for that matter any of the games still to be released this year. While we have full confidence in our benchmark suite from a competitive performance perspective, the fact that it’s not forward looking (and mostly forward rendering) does a disservice to measuring the need for additional memory.

The fact of the matter is that while the benchmarks don’t necessarily show it, we’ve seen Crysis clobber 1GB cards in the past, Shogun II’s highest settings won’t even run on a 1GB card, and meanwhile Battlefield 3 scales up render distance with available video memory. So along with Skyrim, nearly half of our benchmark games can benefit from 2GB cards in some manner, a subjective but important quality.

So despite the fact that our data doesn’t immediately show the benefits of 2GB cards, our thoughts go in the other direction. As 2012 comes to a close, cards that can hit the GTX 650 Ti’s performance level are not well equipped for the future with only 1GB of VRAM. 1GB is the cheaper option, and at these prices every penny counts, but it is our belief that by this time next year 1GB cards will be in the same place 512MB cards were in 2010: bottlenecked by a lack of VRAM. We have reached the point where, if you’re going to be spending $150 or more, you shouldn’t be settling for a 1GB card; this is when 2GB is going to become the minimum for performance gaming video cards.

The Test

NVIDIA’s GTX 650 Ti launch driver is 306.38, which is a further continuation of the 304.xx branch. Compared to the previous two 304.xx drivers there are no notable performance changes or bug fixes that we’re aware of.

Meanwhile on the AMD side we’re using AMD’s newly released 12.9 betas. While these drivers are from a new branch, compared to the older 12.7 drivers the performance gains are minimal. We have updated our results for our 7000 series cards, but the only difference as it pertains to our test suite is that performance in Shogun II and DiRT 3 is slightly higher than with the 12.7 drivers.

On a final note, AMD sent over XFX’s Radeon HD 7850 1GB card so that we had a 1GB 7850 to test with (thanks guys). As this is not a new part from a performance perspective (see above) we’re not doing anything special with this card beyond including it in our charts as validation of the fact that the 1GB and 2GB 7850s are nearly identical outside of Skyrim.

CPU: Intel Core i7-3960X @ 4.3GHz
Motherboard: EVGA X79 SLI
Chipset Drivers: Intel 9.2.3.1022
Power Supply: Antec True Power Quattro 1200
Hard Disk: Samsung 470 (256GB)
Memory: G.Skill Ripjaws DDR3-1867 4 x 4GB (8-10-9-26)
Case: Thermaltake Spedo Advance
Monitor: Samsung 305T
Video Cards:
    AMD Radeon HD 5770
    AMD Radeon HD 6850
    AMD Radeon HD 7770
    AMD Radeon HD 7850 1GB
    AMD Radeon HD 7850 2GB
    NVIDIA GeForce GTS 450
    NVIDIA GeForce GTX 550 Ti
    NVIDIA GeForce GTX 560
    NVIDIA GeForce GTX 650
    NVIDIA GeForce GTX 650 Ti
    NVIDIA GeForce GTX 660
Video Drivers:
    NVIDIA ForceWare 305.37
    NVIDIA ForceWare 306.23 Beta
    NVIDIA ForceWare 306.38 Beta
    AMD Catalyst 12.9 Beta
OS: Windows 7 Ultimate 64-bit

 

Comments
  • Hades16x - Tuesday, October 9, 2012 - link

    A little bit saucy while reading this review on the page "Meet the Gigabyte Gefore GTX 650 TI OC 2GB Windforce" the second to last paragraph reads:

    "Rounding out the package is the usual collection of power adapters and a quick start guide. While it’s not included in the box or listed on the box, the Gigabyte GeForce GTX 660 Ti OC...."

    Shouldn't that read "the Gigabyte GeForce GTX 650 Ti OC" ?

    Thanks for the review Ryan!
  • Hrel - Tuesday, October 9, 2012 - link

    First, card makers: If the card doesn't have FULL size HDMI, I won't even consider it. I get mini on smartphones, makes no damn sense on a GPU that goes into a 20lb desktop. Fuck everyone who does that. Second, Every display I own uses HDMI, most of them ONLY use HDMI. I want to see cards with 3 or 4 HDMI ports on them so I can run 3/4 displays without having to chain together a bunch of fucking adapters. HDMI or GTFO. I really don't understand why any other video cable even exists anymore, DVI is dumb and old, VGA, psh. Display Port? Never even seen it on a monitor/TV. I don't spend stupid amounts of money on stupid resolution displays where NO media is even produced at that resolution; but last I checked HDMI supports 8K video.

    Next: I bought my GTX460 for 130, or 135 bucks. This was a few months after it was released and with a rebate and weekend sale on newegg. Still, that card can MAX out every game I play at 1080p with no issues. I get that they're putting more RAM in the cards now, but that can't really justify more than a 10$ difference; of actual cost. I don't see the GTX660 EVER dropping down to 150 bucks or lower, WTF? Why is the GPU industry getting DRAMATICALLY more expensive and no one seems to be saying a thing? Remember the system RAM price fixing thing? Yeah, that sucked didn't it. I'd really hate to see that happen to GPU's.

    It's good to finally see a tangible improvement in performance in GPU's. From GT8800 to GTX560 improvements were very incremental; seems like an actual gain has been achieved beyond just generational improvements. Hoping consoles have at least 2GB of GDDR5 and at least 4GB of DDR3 system RAM for next gen. Seems like RAM is becoming much more important, based on Skyrim. With that said, I can buy 8GB of system ram for like 30 or 40 bucks. Puts actual cost at a few dollars. No reason at all these cards/consoles can't have shit tons of RAM all over the damn place. RAM is cheap, doesn't cost anything anymore. You can charge 10 bucks/4GB and still turn a stupid profit. Do the right thing Microsoft/Nvidia and everyone else; put shit tons of RAM in AT COST. Make money on the GPU/Console/Games.
  • maximumGPU - Wednesday, October 10, 2012 - link

    we should all be pushing and asking for royalty-free display ports!
    and just so you'd know quite a few high end monitors don't have hdmi, the dell ultrasharp U2312hm comes to mind.
    DP should be the standard.
  • Hrel - Thursday, October 11, 2012 - link

    DP doesn't support audio, as far as I know. Also offers no advantage at all for video. So why?
  • maximumGPU - Thursday, October 11, 2012 - link

    It does support audio!
    with all else being equal the fact that it's royalty free means it's preferable to hdmi.
  • TheJian - Wednesday, October 10, 2012 - link

    I'm not sure about PhysX in AC3, but yeah, odd they put this in there. I would have figured a much cheaper game. When you factor in PhysX in games like Borderlands 2 it changes the game quite a lot. You can interact with objects in a way you can't on AMD:
    "One of the cool things about PhysX is that you can interact with these objects. In this screenshot we are firing a shot at the flag. The bullets go through the flag, causing it to blow a hole in the middle of it. After the actual flag tears apart, the entire string of flags fell down. This happens with flags and other cloth objects that are hanging around, the "Porta-John's" that are scattered across the world, blood and explosive objects. You can not destroy any of these objects without PhysX enabled on at least Medium. "
    http://hardocp.com/article/2012/10/01/borderlands_...

    I don't know why more sites don't talk about the physx stuff. I also like hardocp ALWAYS showing minimums as that is more important than anything else IMHO. I need to know a game is playable or not, not that it can hit 100fps here and there. Their graphs always show how LONG they stay low also. Much more useful info than a max fps shot in time (or even avg to me, I want min numbers). Anandtech only puts mins in where it makes an AMD look good it seems. Not sure other than that why they wouldn't include them in EVERY game with a graph like hardocp showing how long they're there. If you read hardocp it's because they dip a lot, but maybe I'm just a cynic. At least they brought back SC2 :) Cuda is even starting to be used in games like just cause 2 (for water).
    http://www.geforce.com/games-applications/pc-games...
    Interesting :)
  • jtenorj - Wednesday, October 10, 2012 - link

    You can run medium physx on a radeon without much loss of performance.
  • Magnus101 - Wednesday, October 10, 2012 - link

    Why suddenly the race for 60 FPS?
    It used to be 30 FPS average and minimums not going under 18 in Crysis that was considered good.
    Movies are at 24 FPS and stuttering isn't recognisable until you hit 16-17 FPS.
    Pal TV in Europe was at 25 FPS.

    It looks like everybody is buying into Carmacks 60 FPS mantra, which is insane.
    For me minimums above 20 FPS is enough for a game to be perfectly playable.
    This is the snobby debate with audiophiles all over again where they swear they can tell the difference between 96 and 44.1 kHz, just substitute the samplerate with FPS.

    But I guess the Nvidia and ATI are happy that you for no reason just raise the bar of acceptance!
  • ionis - Wednesday, October 10, 2012 - link

    60 FPS has been the target for the past 3-4 years. I'm happy with 25-30 but this min 30 ave 60 FPS target has been going on for quite a while now.
  • CeriseCogburn - Friday, October 12, 2012 - link

    You'll be happy until you play the same games on a cranked SB system with a high end capable videocard and an SSD (on a good low ping connection if multi).

    Until then you have no idea what you are missing. You're telling yourself there isn't more, but there is a lot, lot more.
    Quality
    Fluidity
    Immersion
    Perception in game
    Precision
    Timing
    CONTROL of your game.

    Yes it is snobby to anyone lower than whatever the snob build is - well, sort of, because the price to get there is not much at all really.

    You may not need it, you may "be fine" with what you have, but there is exactly zero chance there isn't a huge, huge difference.
