Final Thoughts

This current generation of video cards has been something of a rollercoaster ride in both performance and leadership. In the last 18 months we’ve seen AMD take the lead with Radeon HD 7970, unexpectedly lose it to GeForce GTX 680, gain it again with Radeon HD 7970 GE and greatly improved drivers, and then break even in the end with GTX 770. GTX 780 and GTX Titan make all of this moot with their much greater single-GPU performance, but priced as they are they’re also nowhere near being in the same market segment as the GTX 770 and 7970GE.

In any case, more than anything else it strikes us as particularly funny that we’re once again looking at a tie. That’s right: on average GTX 770 and 7970GE are tied. GTX 770 delivers 102% of the performance of 7970GE at both our high quality 2560x1440 and high quality 1920x1080 settings. Of course as with some of the past battles between AMD and NVIDIA in this segment, these cards may be tied in our benchmarks but they’re anything but equal.
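
For readers curious how a single figure like that 102% falls out of a multi-game suite, below is a minimal sketch of one common way to average per-game results. The game names and frame rates are hypothetical placeholders, not our actual benchmark data.

```python
# A minimal sketch of averaging relative GPU performance across a game
# suite. All numbers are hypothetical placeholders, not benchmark data
# from this review.
from math import prod

# (GTX 770 fps, 7970GE fps) at one quality/resolution setting
results = {
    "Game A": (62.0, 58.5),
    "Game B": (71.3, 74.0),
    "Game C": (45.1, 44.8),
}

ratios = [gtx770 / r7970ge for gtx770, r7970ge in results.values()]

# A geometric mean keeps one lopsided game from dominating the average,
# which is why it's the usual choice for cross-game summaries.
average = prod(ratios) ** (1 / len(ratios))
print(f"GTX 770 delivers {average:.0%} of 7970GE performance on average")
```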

After all is said and done, the GTX 770 ends up beating the 7970GE in 6 games, while the 7970GE takes the other 4. Meanwhile, within those individual games we'll see anything from a near-tie to a very significant 20% advantage for either side, depending on the game in question. This is very much a repeat of what we saw with the GTX 680 versus the 7970GE, and the GTX 670 versus the 7970.

Our advice for prospective buyers, then, is to first look at benchmarks for the games they intend to play. If you're going to be focused on only a couple of games for the near future, there's a very good chance one card or the other is going to be the best fit. Otherwise, for gamers facing a wide selection of games or looking at future games whose performance is unknown, the GTX 770 and 7970GE are in fact tied, and from a performance perspective you can't go wrong with either one.

With that said, there are a couple of wildcard factors in play here that can tilt things in either side's favor. At $399 the GTX 770 is cheaper than the 7970GE by $20 to $50, depending on the model and whether there's a sale going on (the vanilla 7970 is actually priced closer, but we'd consider the 7970GE the better value among AMD's cards). Consumers at virtually every level are still very price-conscious, so that's going to put AMD in a pinch, as they need the 7970GE, not the vanilla 7970, to match the GTX 770.

At the same time, however, given that we're looking at a performance tie, AMD is making a very serious effort to offer more value than NVIDIA through their Level Up with Never Settle Reloaded gaming bundle. These bundles are non-tangible items, the value of which is solely in the eye of the beholder, but for a buyer interested in those games it's going to be a very convincing argument. And then there's compute performance and the amount of included RAM, both of which continue to favor AMD, though admittedly this is nothing new.

Meanwhile, on a side note, it's interesting to note that, as evidenced by this launch, AMD has pushed NVIDIA to the point where NVIDIA has essentially sacrificed their efficiency advantage to reach performance parity at the $400 price point. At the launch of the 7970GE, NVIDIA at least tied the 250W 7970GE with a 195W GTX 680, giving NVIDIA an efficiency advantage. But now with the launch of the GTX 770, NVIDIA needs a 230W card to match that very same 250W 7970GE, a testament to AMD's driver improvements and a reflection of the fact that, just like AMD, NVIDIA needed to push a GPU to its limits to get here. There are still some edge cases worth considering (you can't get a 7970GE with a blower, for example), but under gaming workloads AMD's and NVIDIA's power consumption and heat generation have been equalized, making these cards more tied than ever before.
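
To put rough numbers on that shift, here's a quick sketch using the TDP figures above and assuming exact performance parity in both match-ups, which is a simplification for illustration:

```python
# Perf-per-watt comparison under an assumed performance tie, so the
# efficiency edge reduces to an inverse ratio of board power. TDPs are
# the figures cited above; exact parity is an assumption.
matchups = {
    "GTX 680 vs. 7970GE (2012)": (195, 250),
    "GTX 770 vs. 7970GE (2013)": (230, 250),
}

for name, (nvidia_watts, amd_watts) in matchups.items():
    edge = amd_watts / nvidia_watts - 1
    print(f"{name}: NVIDIA perf/W advantage of ~{edge:.0%}")

# -> roughly a 28% edge at the GTX 680's launch, shrinking to ~9% now
```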

Ultimately a tie is a wonderful thing and a frustrating thing at the same time, and that's definitely the case here with the launch of the GTX 770. The wonderful aspect is that NVIDIA and AMD are once again locked in vicious, brutal combat around the $400 price point, which has brought performance up and prices down in the middle of a generation, improving the options for all customers. The frustrating aspect is that a clear winner makes customers feel better by removing any question about whether they've made the right choice, and a tie offers no such assurance. After all, it's much easier to make a choice when there's really no choice to be made.

Moving on to some other comparisons, though we've focused mostly on the immediate competition, for those buyers on an upgrade cycle things have panned out pretty much as expected. The GTX 770 delivers an average performance improvement of 75% over the two-and-a-half-year-old GTX 570, which is roughly what we'd expect for jumping from one mid-generation card to another, and at $399 it is reasonably priced as an upgrade. The performance improvement over the GTX 670 is much smaller at just 20%, but the GTX 770 is clearly not targeted at GTX 670 owners as an upgrade. At the same time it's interesting to note that between the higher core clockspeed, higher memory clockspeed, and higher TDP plus GPU Boost 2.0 found on the GTX 770, NVIDIA has improved performance over the GTX 680 by just 7% on average. This isn't a lot in and of itself, but we're talking about replacing a $450 video card with a $400 video card that's faster across the board, so it's a nice way to raise the bar on performance while bringing prices down.
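
That last point is easy to quantify with some back-of-the-envelope math; the prices and the 7% figure come from this review, while the performance-per-dollar framing is simply our illustration:

```python
# Back-of-the-envelope value math for GTX 680 -> GTX 770: ~7% more
# performance for roughly $50 less. Prices and the 7% average gain come
# from the review; perf-per-dollar is our own normalization.
gtx680_price, gtx770_price = 450.0, 399.0
relative_perf = 1.07  # GTX 770 performance relative to GTX 680

value_gain = relative_perf * (gtx680_price / gtx770_price) - 1
print(f"Performance per dollar improves by ~{value_gain:.0%}")
# -> ~21%: a modest 7% performance bump still meaningfully raises the
#    bar once the price cut is factored in
```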

Wrapping things up, this should set the stage for the enthusiast/high-end market for the rest of the year. According to AMD's latest schedule they won't have a new high-end part to replace Tahiti until the end of the year, and NVIDIA won't have Maxwell until 2014, all of this complicated by the fact that TSMC's 20nm process is still so far out. NVIDIA still has the rest of the GeForce 700 lineup to roll out over the next few months, but for the GTX 770 and the 7970/7970GE, the rest of the year will be a battle of prices and bundles.

117 Comments

  • Enkur - Thursday, May 30, 2013 - link

    Why is there a picture of the Xbox One in the article when it's mentioned nowhere?
  • Razorbak86 - Thursday, May 30, 2013 - link

    The 2GB Question & The Test

    "The wildcard in all of this will be the next-generation consoles, each of which packs 8GB of RAM, which is quite a lot of RAM for video operations even after everything else is accounted for. With most PC games being ports of console games, there’s a decent risk of 2GB cards being undersized when used with high resolutions and the highest quality art assets. The worst case scenario is only that these highest quality assets may not be usable at playable performance, but considering the high performance of every other aspect of GTX 770 that would be a distinct and unfortunate bottleneck."
  • kilkennycat - Thursday, May 30, 2013 - link

    NONE of the release offerings (May 30) of the GTX 770 on Newegg have the Titan cooler!!!! Regardless of the pictures in this article and on the GTX 7xx main page on Newegg. And no bundled software to "ease the pain" and perhaps help mentally deaden the fan noise... this product takes more power than the GTX 680. Early buyers beware...!!
  • geok1ng - Thursday, May 30, 2013 - link

    "Having 2GB of RAM doesn’t impose any real problems today, but I’m left to wonder for how much longer that’s going to be true. The wildcard in all of this will be the next-generation consoles, each of which packs 8GB of RAM, which is quite a lot of RAM for video operations even after everything else is accounted for. With most PC games being ports of console games, there’s a decent risk of 2GB cards being undersized when used with high resolutions and the highest quality art assets. "

    Last week a noob posted something like that on the 780 review, and it was decimated by a slew of tech geeks' comments afterward. I am surprised to see the same kind of reasoning in a text written by an AT expert.

    All AT reviewers by now know that the next consoles will be using an APU from AMD with graphics muscle (almost) comparable to a 6670 (a 5670 in the PS4's case, thanks to GDDR5). So what Mr. Ryan Smith is stating is that an "8GB" 6670 can perform better than a 2GB 770 in video operations?

    I am well aware that Mr. Ryan Smith is over-qualified to help AT readers revisit this old legend of graphics memory:
    How little is too little?

    And please let us not start flaming about memory usage: most modern OSes and gaming engines use available RAM dynamically, so seeing a game use 90%+ of available graphics memory does not imply, at all, that such a game would run faster if we doubled the graphics memory. The opposite is often true.

    As soon as 4GB versions of the 770 launch, AT should pit those versions against the 2GB 770 and the 3GB 7970. Or we could go back and re-read the tests done when the 4GB versions of the 680 came out: only at triple-screen resolutions and insane levels of AA would we see any theoretical advantage of 3-4GB over 2GB, which is largely impractical since most games can't run at those resolutions and AA levels on a single card anyway.

    I think NVIDIA did it right (again): 2GB is enough for today, and we won't see next-gen consoles running triple-screen resolutions at 16xAA+. 2GB means a lower BoM, which is good for profit and price competition, and lower energy consumption, which is good for card temps and max OC results.
  • Enkur - Thursday, May 30, 2013 - link

    I can't believe AT is mixing up the unified graphics and system memory on consoles with the dedicated RAM of a graphics card. It doesn't make sense.
  • Egg - Thursday, May 30, 2013 - link

    PS4 has 8GB of GDDR5 and a GPU somewhat close to a 7850. I don't know where you got your facts from.
  • geok1ng - Thursday, May 30, 2013 - link

    Just to start the flaming war: the next consoles will not run on monolithic GPUs, but on twin Jaguar cores. So when you see those 768/1152 GPU core counts, remember these are "crossfired" cores. And in both consoles the GPU is running at a mere 800MHz, hence the comparison with the 5670/6670, 480-shader cards @ 800MHz.
    It is widely accepted that console games are developed to the lowest common denominator, in this case the Xbox One's DDR3 memory. Even if we make the huge assumption that dual Jaguar cores running in tandem can work like a 7850 (1024 cores at 860MHz) in a PS4 (which is a huge leap of faith, looking back at how badly AMD fared in previous crossfire attempts using integrated GPUs like these), it turns out to be the same question:

    Does an 8GB 7850 give us better graphical results than a 2GB 770 for any gaming application in the foreseeable future?

    Don't 4K on me, please: both consoles will be using HDMI, not DisplayPort. And no, they won't be able to drive games across 3 screens. This "next-gen consoles will have more video RAM than high-end PC GPUs, so their games will be better" line is reminiscent of the old "1GB DDR2 cards are better than 256MB DDR3 cards for future games" scam.
  • Ryan Smith - Thursday, May 30, 2013 - link

    We're aware of the difference. A good chunk of that unified memory is going to be consumed by the OS, the application, and other things that typically reside on the CPU in a PC. But we're still expecting games to be able to load 3GB+ in assets, which would be a problem for 2GB cards.
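
    As a back-of-the-envelope illustration of that budget (every figure below is an assumption for the sake of illustration; the actual OS and game reservations have not been confirmed publicly):

```python
# Rough sketch: how 8GB of unified console memory could still leave 3GB+
# for graphics assets. All reservations below are assumed figures for
# illustration, not confirmed platform numbers.
total_unified_gb = 8.0   # PS4 / Xbox One unified memory
os_reserve_gb = 3.0      # assumed OS + system reservation
cpu_side_gb = 2.0        # assumed game code, logic, audio, etc.

asset_budget_gb = total_unified_gb - os_reserve_gb - cpu_side_gb
print(f"Left for graphics assets: ~{asset_budget_gb:.0f} GB")
# -> ~3 GB of assets, more than a 2GB card can hold resident
```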
  • iEATu - Thursday, May 30, 2013 - link

    Why are you guys using FXAA in benchmarks as high-end as these? Especially for games like BF3, where you have FPS over 100. 4x AA for 1080p and 2x for 1440p; no question those look better than FXAA...
  • Ryan Smith - Thursday, May 30, 2013 - link

    In BF3 we're testing both FXAA and MSAA. Otherwise most of our other tests are MSAA, except for Crysis 3 which is FXAA only for performance reasons.
