Meet The GeForce GTX 780, Cont

With all of that said, GTX 780 does make one notable deviation from GTX Titan: NVIDIA has changed the stock fan programming, slowing down the fan's response time to even out fluctuations in fan speed. NVIDIA tells us that, after outright loud fans, the second most important factor in fan noise becoming noticeable is rapidly changing fan speeds, with the shifting pitch and volume drawing attention to the card. Slowing down the response time should in theory keep the fan speed from spiking as much, or from quickly dropping (e.g. during a loading screen) only to have to immediately jump back up again.
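To illustrate the general idea, here is a minimal sketch of slew-limited fan control in Python. The smoothing factor, update model, and fan-speed values are purely illustrative assumptions on our part, not NVIDIA's actual firmware parameters:

```python
# Illustrative sketch of fan-speed smoothing: the commanded speed moves
# only a fraction of the way toward the thermal target each update, so
# brief load dips don't produce audible swings in fan pitch and volume.
# alpha is a made-up smoothing constant, not an NVIDIA firmware value.

def smooth_fan_speed(targets, alpha=0.1):
    """Slew the commanded fan speed (in %) gradually toward each target."""
    smoothed = []
    current = targets[0]
    for target in targets:
        # Close only a fraction of the gap to the new target per tick.
        current += alpha * (target - current)
        smoothed.append(round(current, 1))
    return smoothed

# A load dip from 60% to 30% fan speed and back (e.g. a loading screen):
raw = [60, 60, 30, 30, 60, 60]
print(smooth_fan_speed(raw))
```

With the raw targets the fan would swing over a 30-point range; the smoothed output stays within a few points of its starting speed, which is the behavior NVIDIA is describing.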

In our experience fan response times haven't been an issue with Titan or past NVIDIA cards, and we'd be hard pressed to tell the difference between GTX 780 and Titan. With that said, there's nothing to lose from this change; GTX 780 doesn't seem to be in any way worse for it, so in our eyes there's no reason for NVIDIA not to go ahead with it.

On that note, since this is purely a software (BIOS) change, we asked NVIDIA whether it could be backported to the hardware-equivalent Titan. The answer is fundamentally yes, but because NVIDIA doesn't have a backup BIOS system, they aren't keen on using BIOS flashing any more than necessary. An official (or even unofficial) update from NVIDIA is therefore unlikely, but given the user community's adept BIOS modding skills, it's always possible a third party could accomplish this on their own.

Moving on, unlike Titan and GTX 690, NVIDIA will be allowing partners to customize GTX 780, making this the first line of GK110 cards to allow customization. Potential buyers who were for whatever reason put off by Titan's blower will find that NVIDIA's partners are already putting together more traditional open air coolers for GTX 780. We can't share any data about them yet – today is all about the reference card – but we already have one such card in hand with EVGA's GeForce GTX 780 ACX.

The reference GTX 780 sets a very high bar in terms of build quality and performance, so it will be interesting to see what NVIDIA's partners come up with. With NVIDIA testing and approving all designs under its Greenlight program, every custom card has to meet or beat the reference card in factors such as noise and power delivery, which for GTX 780 will be no easy feat. Because of this requirement, however, NVIDIA's partners can deviate from the reference design without buyers needing to worry that custom cards are significantly worse than the reference cards. This benefits the partners, who can attest to the quality of their products ("it got through Greenlight"), and it benefits buyers, who know they're getting something at least as good as the reference GTX 780 regardless of the specific make or model.

On that note, since we're talking about card construction, let's quickly dive into overclocking. Overclocking is essentially unchanged from GTX Titan, especially since everything so far is using the reference PCB. The maximum power target remains at 106% (265W) and the maximum temperature target remains at 95C. Buyers will be able to adjust these as they please through Precision X and other tools, but no further than they already could on Titan, which means overclocking remains fairly locked down.

Overvolting is also supported in a Titan-like manner, and once again is at the discretion of the card's partner. By default GTX 780 has a maximum voltage of 1.1625v, with approved overvolting allowing the card to be pushed to 1.2v. This comes in the form of higher boost bins, so enabling overvolting is equivalent to unlocking a +13MHz bin and a +26MHz bin along with their requisite voltages. In practice, however, overvolting has only a minimal effect, as most overclocking attempts will hit TDP limits before they ever reach the unlocked boost bins.

GeForce Clockspeed Bins
Clockspeed    GTX Titan    GTX 780
1032MHz       N/A          1.2v
1019MHz       1.2v         1.175v
1006MHz       1.175v       1.1625v
992MHz        1.1625v      1.15v
979MHz        1.15v        1.137v
966MHz        1.137v       1.125v
953MHz        1.125v       1.112v
940MHz        1.112v       1.1v
927MHz        1.1v         1.087v
914MHz        1.087v       1.075v
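As a rough illustration of how the voltage ceiling maps to boost bins, the GTX 780 column of the table above can be expressed as a simple lookup. The `max_boost_clock` helper is our own illustrative construct, not anything NVIDIA's tools actually expose:

```python
# (clockspeed MHz, GTX 780 voltage) pairs taken from the bin table above.
GTX780_BINS = [
    (914, 1.075), (927, 1.087), (940, 1.1), (953, 1.112),
    (966, 1.125), (979, 1.137), (992, 1.15), (1006, 1.1625),
    (1019, 1.175), (1032, 1.2),
]

DEFAULT_VMAX = 1.1625   # stock maximum voltage
OVERVOLT_VMAX = 1.2     # maximum with approved overvolting

def max_boost_clock(vmax):
    """Highest boost bin reachable under a given voltage ceiling."""
    return max(clock for clock, volts in GTX780_BINS if volts <= vmax)

print(max_boost_clock(DEFAULT_VMAX))   # stock ceiling
print(max_boost_clock(OVERVOLT_VMAX))  # overvolted ceiling
```

Running this shows the stock ceiling at 1006MHz and the overvolted ceiling at 1032MHz – the +13MHz and +26MHz bins the article describes.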


Comments

  • mac2j - Thursday, May 23, 2013 - link

    The problem with $650 vs $500 for this price point is this:

    I can get 2 x 7950s for <$600 - that's a setup that destroys a 780 for less money.

    Even if you're single-GPU limited, $250 is a lot of extra cash for a relatively small performance gain.
  • Ytterbium - Thursday, May 23, 2013 - link

    I'm disappointed they decided to cut the compute to 1/24 vs 1/3 in Titan, AMD is much better value for compute tasks.
  • BiffaZ - Friday, May 24, 2013 - link

    Except much consumer (@home type) compute is SP not DP, so it won't make much difference. The 780's SP performance is around equal to or higher than AMD's.
  • Nighyal - Thursday, May 23, 2013 - link

    I don't know if this is possible but it would be great to see a benchmark that showed power, noise and temperature at a standard work load. We can get an inferred idea of clock per watt performance but when you're measuring a whole system other factors come into play (you mentioned CPU loads scaling with increased GPU performance).

    My interest in this comes from living in a hot climate (Australia) where a computer can throw out a very noticeable amount of heat. The large majority of my usage is light gaming (LoL) but I occasionally play quite demanding single player titles which stretch the legs of my GPU. The amount of heat thrown out is directly proportional to power draw, so being able to clearly see how many fewer watts a system requires for a controlled workload would be a handy comparison for me.

    TL;DR - Please also measure temperature, noise and power at a controlled workload to isolate performance per watt.
  • BiggieShady - Friday, May 24, 2013 - link

    Kudos on the FCAT and the delta percentage metrics. So 32.2% for the 7990 means that on average one frame is present 32.2% longer than the next. Still, it is only an average. Great extra info would be to show the same metric averaging only the deltas higher than a threshold delta, and to display it on a graph with varying thresholds.
  • flexy - Friday, May 24, 2013 - link

    NV releases a card at a ridiculous price point of $1000. Then they cut down the exact same card and give it a new name, making it look like it's a "new card", and sell it cheaper than their way overpriced high end card. Which, of course, is a "big deal" (sarcasm) given the crazy price of Titan. Either way, I don't like what NV is doing, in the slightest.

    Many ages ago, people could buy *real* top of the line cards which always cost about $400-$500; today you pay $600 for "trash cards" which didn't make it into production for Titan due to sub-par chips. Nvidia: "Hey, let's just make up a new card and sell those chips too, lols"

    Please AMD, help us!!
  • bds71 - Friday, May 24, 2013 - link

    For what it's worth, I would have liked to see the 780 *truly* fill the gap between the 680 and Titan by offering not only the gaming performance but ALSO the compute performance - if they had gone with 1/6 or even 1/12!! to better fill the gap and round out the performance all around, I would HAPPILY pay $650 for this card. As it is, I already have a 690, so I will simply get another for 4K gaming - but a comparison between 3x 780s and 2x 690s (both very close to $2k) at 8Mpixels+ resolution would be extremely interesting. Note: 3x 30" monitors could easily be configured for 4800x2560 resolution via NVIDIA Surround or Eyefinity - and I, for one, would love to see THAT review!!
  • flexy - Friday, May 24, 2013 - link

    Well, compute performance is the other thing, along with their questionable GPU throttle aka "boost" (yeah right) technology. Paying a premium for such a card and then getting weak compute performance in exchange, compared to older gen cards or the AMD offerings... Seriously, there is a lot not to like about Kepler, at least from an enthusiast point of view. I hope NV doesn't continue down that route, with their cards becoming less attractive while prices go up.
  • EJS1980 - Wednesday, May 29, 2013 - link

    Cynical much?
  • ChefJeff789 - Friday, May 24, 2013 - link

    Glad to see the significant upgrade. I just hope that AMD forces the prices back down again soon. I hope the AMD release "at the end of the year" is closer to September than December. It'll be interesting to see how they stack up. BTW, I have shied away from AMD cards ever since I owned an X800 and had SERIOUS issues with the catalyst drivers (constant blue-screens, had to do a Windows clean-install to even get the card working for longer than a few minutes). I know this was a long time ago, and I've heard from numerous people that they're better now. Is this true?
