Meet The GeForce GTX 770

It’s unfortunate that none of NVIDIA’s North American partners will be selling cards based on NVIDIA’s reference design, since NVIDIA is once again using GTX Titan as the template, making for a very high quality card. The absence of the reference design also means that not everything we have to say about GTX 770 will be applicable to retail cards; we’re essentially reviewing a card with a unique cooler you can’t buy, which has been something of a recurring problem for us with these virtual launches.

In lieu of the reference design, NVIDIA’s partners will be going semi-custom right from the start. Much of what we’re going to see are various 2 and 3 fan open air coolers, though at least a couple of partners will also be selling blowers, albeit with plastic in place of the Titan-derived metal cooler. Still, blowers may be a bit hard to come by with GTX 770, which is something of an odd outcome given how prevalent blowers have been at this performance tier in the past.

In any case, we have a few different semi-custom GTX 770 cards that just arrived in-house (all of the overclocked variety), which we’ll be looking at next week. In the meantime, let’s dive into NVIDIA’s reference GTX 770.

Whereas GTX 780 was truly a Titan Mini, GTX 770 has a few more accommodations to account for the differences between the products, but the end product is still remarkably Titan-like. In short, GTX 770 is still a 10.5” long card composed of a cast aluminum housing, a nickel-tipped heatsink, an aluminum baseplate, and a vapor chamber providing heat transfer between the GPU and the heatsink. The end result is that NVIDIA maintains Titan’s excellent cooling performance while also maintaining Titan’s solid feel and eye-catching design.

The story is much the same for the PCB and component selection. The PCB itself is Titan’s PCB retrofitted for use with GK104 instead of GK110, which amounts to a handful of differences. Besides a new memory layout suitable for a 256-bit bus operating at 7GHz, the other big change is that NVIDIA has scaled down the power circuitry slightly, from a 6+2 phase design on their GK110 cards to a 5+1 phase design for GTX 770, in reflection of GTX 770’s lower 230W TDP.
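
As a point of reference, the quoted 7GHz is the effective GDDR5 data rate (7Gbps per pin), so the peak memory bandwidth of the new layout falls out of a little arithmetic. The quick sketch below (the helper function is purely our own illustration) also runs GTX 680’s 6GHz configuration for comparison.

```python
# Back-of-the-envelope peak memory bandwidth. "7GHz" is the effective (per-pin)
# GDDR5 data rate, i.e. 7Gbps on each pin of the 256-bit bus.

def peak_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/sec = (bus width in bytes) x (effective data rate)."""
    return (bus_width_bits / 8) * data_rate_gbps

print(peak_bandwidth_gbps(256, 7.0))  # GTX 770: 224.0 GB/sec
print(peak_bandwidth_gbps(256, 6.0))  # GTX 680: 192.0 GB/sec
```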

On that note, for those of you looking for clean pictures of the PCB and GPU, unfortunately you will be out of luck, as NVIDIA used the same silk-screened Shin-Etsu thermal compound as they did for GTX Titan. This compound is great for transferring heat and a boon for GTX 770 buyers, but its composition and application mean that we can’t take apart these cards without irrevocably damaging their cooling capabilities, and NVIDIA didn’t take pictures of their own.

Anyhow, with all of the similarities between GTX 770 and GTX 780/Titan, we are otherwise looking at a card that could be mistaken for Titan if not for the “GTX 770” stamped into the card’s shroud. This means the I/O options are also identical, with a set of 8-pin + 6-pin power sockets providing the necessary extra power, a pair of SLI connectors allowing for up to 3-way SLI, and NVIDIA’s standard display output configuration of 2x DL-DVI, 1x HDMI, and 1x DisplayPort 1.2.

As with GTX 780, we expect to see some interesting designs come out of NVIDIA’s partners. The Titan cooler sets an extremely high bar here, especially given that it was designed for a higher 250W TDP, meaning it’s slightly overpowered for GTX 770. Meanwhile NVIDIA’s Greenlight approval program means that their partners’ semi-custom and custom designs need to maintain roughly the same level of quality, hence the common use of open-air coolers.

GeForce Clockspeed Bins
Clockspeed   GTX 770 Voltage   GTX 680 Voltage
1149MHz      1.212v            N/A
1136MHz      1.2v              N/A
1123MHz      1.187v            N/A
1110MHz      1.162v            1.175v
1097MHz      1.15v             1.15v
1084MHz      1.137v            1.137v
1071MHz      1.125v            1.125v
1058MHz      1.112v            1.125v
1045MHz      1.1v              1.112v

Moving on to overclocking, as this is a GPU Boost 2.0 part, overclocking operates the same way it did on GTX 780, and yes, this includes overvolting. GTX 770’s maximum power target is 106% (244W), and a very mild overvoltage of +0.012v is available, unlocking one higher boost bin (from 1136MHz at 1.2v to 1149MHz at 1.212v, per the table above). This also means that GTX 770 follows the usual TDP and temperature throttling conditions, with a standard temperature throttle of 80C. In practice (at least on our reference card) GTX 770 typically reaches its highest clockspeeds before it hits the TDP or temperature throttles, so those limits are mostly of use in concert with overvolting and offset clocks.
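
Putting the clockspeed bin table and these limits together, the relationship is easy to see. The sketch below is purely illustrative (a toy lookup over the published bins, not NVIDIA’s actual boost algorithm), but it shows how the voltage ceiling picks the top boost bin and where the 244W figure comes from.

```python
# Illustrative only -- not NVIDIA's boost algorithm. Bins are taken from the
# GeForce Clockspeed Bins table above: (boost clock in MHz, required voltage).
GTX770_BINS = [
    (1149, 1.212), (1136, 1.200), (1123, 1.187), (1110, 1.162), (1097, 1.150),
    (1084, 1.137), (1071, 1.125), (1058, 1.112), (1045, 1.100),
]

def highest_bin(voltage_ceiling: float) -> int:
    """Return the highest boost clock whose bin voltage fits under the ceiling."""
    return max(clock for clock, volts in GTX770_BINS if volts <= voltage_ceiling)

print(highest_bin(1.200))      # 1136 MHz at the stock 1.2v ceiling
print(highest_bin(1.212))      # 1149 MHz with the +0.012v overvolt applied
print(round(230 * 1.06, 1))    # 243.8 -> the ~244W (106%) maximum power target
```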

Finally, GTX 770 also includes the incremental fan speed improvements first introduced last week with GTX 780. So like GTX 780, GTX 770’s default fan controller programming is biased to react more slowly to temperature changes in order to minimize sudden shifts in fan speed.
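
To illustrate the general idea only (this is not NVIDIA’s fan controller firmware, and the 1-point-per-update step size is an arbitrary value we picked for the example), a slew-rate-limited fan controller behaves something like this:

```python
# Hypothetical sketch of slew-rate-limited fan control: the fan tracks the
# temperature-derived target, but may only move a small step per update, so
# brief temperature spikes don't produce sudden shifts in fan speed.

def next_fan_speed(current_pct: float, target_pct: float, max_step_pct: float = 1.0) -> float:
    """Move fan speed toward the target, but no faster than max_step_pct per update."""
    delta = target_pct - current_pct
    delta = max(-max_step_pct, min(max_step_pct, delta))
    return current_pct + delta

speed = 40.0
for target in (40, 65, 65, 45, 45):  # a brief load spike, then back toward idle
    speed = next_fan_speed(speed, target)
    print(speed)                     # 40.0, 41.0, 42.0, 43.0, 44.0 -- no sudden jump to 65%
```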

Comments

  • Enkur - Thursday, May 30, 2013

    Why is there a picture of Xbox One in the article when it's mentioned nowhere?
  • Razorbak86 - Thursday, May 30, 2013

    The 2GB Question & The Test

    "The wildcard in all of this will be the next-generation consoles, each of which packs 8GB of RAM, which is quite a lot of RAM for video operations even after everything else is accounted for. With most PC games being ports of console games, there’s a decent risk of 2GB cards being undersized when used with high resolutions and the highest quality art assets. The worst case scenario is only that these highest quality assets may not be usable at playable performance, but considering the high performance of every other aspect of GTX 770 that would be a distinct and unfortunate bottleneck."
  • kilkennycat - Thursday, May 30, 2013

    NONE of the release offerings (May 30) of the GTX 770 on Newegg have the Titan cooler!!!! Regardless of the pictures in this article and on the GTX 7xx main page on Newegg. And no bundled software to "ease the pain" and perhaps help mentally deaden the fan noise..... this product takes more power than the GTX 680. Early buyers beware...!!
  • geok1ng - Thursday, May 30, 2013

    "Having 2GB of RAM doesn’t impose any real problems today, but I’m left to wonder for how much longer that’s going to be true. The wildcard in all of this will be the next-generation consoles, each of which packs 8GB of RAM, which is quite a lot of RAM for video operations even after everything else is accounted for. With most PC games being ports of console games, there’s a decent risk of 2GB cards being undersized when used with high resolutions and the highest quality art assets. "

    Last week a noob posted something like that on the 780 review. It was decimated by a slew of tech geeks' comments afterward. I am surprised to see the same kind of reasoning in a text written by an AT expert.

    All AT reviewers by now know that the next consoles will be using an APU from AMD with graphics muscle (almost) comparable to a 6670 (5670 in the PS4's case, thanks to GDDR5). So what Mr. Ryan Smith is stating is that an "8GB" 6670 can perform better than a 2GB 770 in video operations?

    I am well aware that Mr. Ryan Smith is over-qualified to help AT readers revisit this old legend of graphics memory:
    How little is too little?

    And please let us not start flaming about memory usage - most modern OSs and gaming engines use available RAM dynamically, so if one sees a game use 90%+ of available graphics memory it does not imply, at all, that such a game would run faster if we doubled the graphics memory. The opposite is often true.

    As soon as 4GB versions of the 770 launch, AT should pit those versions against the 2GB 770 and the 3GB 7970. Or we could go back months and re-read the tests done when the 4GB versions of the 680 came out - only at triple screen resolutions and insane levels of AA would we see any theoretical advantage of 3-4GB over 2GB, which is largely impractical since most games can't run at those resolutions and AA levels with a single card anyway.

    I think NVIDIA did it right (again): 2GB is enough for today, and we won't see next gen consoles running triple screen resolutions at 16xAA+. 2GB means a lower BoM, which is good for profit and price competition, and less energy consumption, which is good for card temps and max OC results.
  • Enkur - Thursday, May 30, 2013

    I can't believe AT is mixing up the unified graphics and system memory on consoles with the dedicated RAM of a graphics card. Doesn't make sense.
  • Egg - Thursday, May 30, 2013

    PS4 has 8GB of GDDR5 and a GPU somewhat close to a 7850. I don't know where you got your facts from.
  • geok1ng - Thursday, May 30, 2013

    Just to start the flaming war - the next consoles will not run on monolithic GPUs, but on twin Jaguar cores. So when you see those 768/1152 GPU core counts, remember these are "crossfired" cores. And in both consoles the GPU is running at a mere 800MHz, hence the comparison with the 5670/6670, 480 shader cards @ 800MHz.
    It is widely accepted that console games are developed using the lowest common denominator, in this case the Xbox One's DDR3 memory. Even if we take the huge assumption that dual Jaguar cores running in tandem can work similarly to a 7850 - 1024 cores at 860MHz - in a PS4 (which is a huge leap of faith looking back at how badly AMD fared in previous CrossFire attempts using integrated GPUs like these Jaguar cores), that turns out to be the same question:

    Does an 8GB 7850 give us better graphical results than a 2GB 770, for any gaming application in the foreseeable future?

    Don't go 4K on me please: both consoles will be using HDMI, not DisplayPort. And no, they won't be able to drive games across 3 screens. This "next-gen consoles will have more video RAM than high-end GPUs in PCs, so their games will be better" line is reminiscent of the old "1GB DDR2 cards are better than 256MB DDR3 cards for future games" scam.
  • Ryan Smith - Thursday, May 30, 2013

    We're aware of the difference. A good chunk of that unified memory is going to be consumed by the OS, the application, and other things that typically reside on the CPU in a PC. But we're still expecting games to be able to load 3GB+ in assets, which would be a problem for 2GB cards.
  • iEATu - Thursday, May 30, 2013

    Why are you guys using FXAA in benchmarks as high end as these? Especially for games like BF3 where you have FPS over 100. 4x AA for 1080p and 2x for 1440p. No question those look better than FXAA...
  • Ryan Smith - Thursday, May 30, 2013

    In BF3 we're testing both FXAA and MSAA. Otherwise most of our other tests are MSAA, except for Crysis 3 which is FXAA only for performance reasons.
