Power, Temperature, & Noise

Having seen the gaming performance of these cards and the limited gains from their factory overclocks, let’s move on to power, temperature, and noise. With their virtually identical clockspeeds and performance, if these cards are going to differentiate themselves it will have to be through factors the manufacturers have more direct control over, and their respective coolers are the obvious place to do it.

Idle Power Consumption

Load Power Consumption - Battlefield 3

Load Power Consumption - FurMark

Curiously, power consumption shows a much wider variation than we’d expect. The 22W at-the-wall difference under BF3 is the kind of gap we’d expect between cards of different product lines, not between two cards based on the same GPU. In this case Gigabyte’s power consumption is practically tied with our reference GTX 760, while the EVGA card is clearly drawing more power. That puts the EVGA card at a disadvantage here, and unfortunately it’s one that will have repercussions for temperatures and acoustics.
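As an aside, keep in mind these are at-the-wall figures, so PSU losses inflate the gap somewhat relative to what the cards themselves draw. A back-of-the-envelope conversion, where the ~88% efficiency figure is our assumption for a typical 80 Plus PSU at these loads rather than a measured value:

```python
# Rough conversion of an at-the-wall power delta into a DC-side delta.
# The 0.88 efficiency value is an assumed figure for a typical 80 Plus
# PSU at roughly 300W load, not something we measured.
PSU_EFFICIENCY = 0.88

def wall_delta_to_dc_delta(wall_delta_w: float, efficiency: float = PSU_EFFICIENCY) -> float:
    """Approximate the card-level power difference implied by a wall-side delta."""
    return wall_delta_w * efficiency

print(f"{wall_delta_to_dc_delta(22):.1f}W")  # ~19.4W at the cards themselves
```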

FurMark meanwhile flips the picture entirely. Now the Gigabyte 760OC is drawing more power than the EVGA card, on the order of 30W, and a full 72W more than the stock GTX 760. With FurMark being a fairly pure test of TDP and sustained loads, there appear to be two things going on. First and foremost, the significant cooling capacity of the Windforce cooler means that the card never temperature throttles and always reaches its TDP/voltage limits, even under FurMark. Second, since this is a recycled Gigabyte design intended for a 225W+ card, it would appear that the power target for this card is higher than the 170W given in the card’s official specs.
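For context on how these limits interact: GPU Boost 2.0 holds the card to whichever constraint binds first, be it the power target, the voltage/clockspeed cap, or the temperature throttle point. A minimal sketch of that arbitration in Python, with the caveat that the limit values here are illustrative assumptions rather than numbers read out of either card’s BIOS:

```python
# Illustrative model of GPU Boost-style limit arbitration. The limits are
# assumptions for illustration; we have not dumped either card's BIOS.
POWER_TARGET_W = 170   # official GTX 760 spec; Gigabyte's appears to be higher
TEMP_THROTTLE_C = 80   # GPU Boost 2.0 default throttle point

def may_boost_further(power_w: float, temp_c: float, at_voltage_cap: bool) -> bool:
    """The card only climbs to higher boost bins if no limit is binding."""
    if power_w >= POWER_TARGET_W:
        return False  # power-limited (what FurMark forces here)
    if temp_c >= TEMP_THROTTLE_C:
        return False  # temperature-limited
    if at_voltage_cap:
        return False  # out of voltage/clockspeed headroom
    return True
```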

The EVGA card, though drawing less power than the Gigabyte card, falls into a similar scenario. 42W over our reference GTX 760 at the wall is nothing to sneeze at. EVGA’s card appears to be similarly built for a higher TDP, albeit one lower than Gigabyte’s. And with both cards being built for higher wattages, some of that overhead may also come from the more complex power delivery circuitry both use compared to the reference GTX 760.

Regardless, in these cases we’re more interested in gaming power consumption than FurMark power consumption, seeing as we don’t expect Gigabyte’s higher power target to be reached at stock; the GTX 760 should run out of headroom in other areas first. As such, Gigabyte looks to have the power consumption advantage where it will matter most.

Idle GPU Temperature

Load GPU Temperature - Battlefield 3

Load GPU Temperature - FurMark

Moving on to temperatures, we’re seeing a clear advantage for Gigabyte’s design here. The 760OC runs 6C cooler under BF3 and 7C cooler under FurMark. Both cards stay easily below the 80C throttle point even under FurMark, but it’s still a clear victory for Gigabyte. EVGA’s higher power consumption isn’t doing them any favors, and unless there’s a radical difference in the fan curves – as we’ll see, there isn’t – this gap is the combined result of the EVGA card’s higher power consumption and its less effective cooler.
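Since fan curves are the other variable here, a brief illustration of how one works: vendor firmware typically maps GPU temperature to a fan duty cycle by interpolating between a handful of fixed points. The curve points below are hypothetical placeholders, not the tables actually shipped on either card:

```python
# Linear-interpolation fan curve of the sort vendor firmware commonly uses.
# The (temp C, duty %) points are hypothetical, not either card's real table.
FAN_CURVE = [(30, 20), (50, 30), (70, 45), (80, 65), (95, 100)]

def fan_duty_percent(temp_c: float, curve=FAN_CURVE) -> float:
    """Interpolate the fan duty cycle for a given GPU temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]

print(fan_duty_percent(75))  # 55.0, halfway between the 70C and 80C points
```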

Idle Noise Levels

Load Noise Levels - Battlefield 3

Load Noise Levels - FurMark

Finally, looking at our noise results, we’re once again seeing a clear advantage for Gigabyte. Along with being cooler, their 760OC also ends up quieter than EVGA’s 760SC under both BF3 and FurMark. The difference under BF3 is 4.5dB, and even with Gigabyte’s higher power consumption under FurMark it still holds a 2.2dB advantage. EVGA’s performance is by no means poor for a 170W+ open air cooled card, but as these results show, Gigabyte’s 760OC is going to be noticeably quieter under the same workloads. Even idle noise is in Gigabyte’s favor by 2dB.
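To put those deltas in perspective, decibels are logarithmic: a difference of ΔdB corresponds to a 10^(Δ/10) ratio in sound power, and the common rule of thumb (an approximation, not a measurement) is that +10dB reads as roughly twice as loud. A quick conversion of the gaps above:

```python
# Convert dB(A) deltas into sound power ratios, plus a rough perceived
# loudness ratio using the "+10 dB is about twice as loud" rule of thumb.
def sound_power_ratio(delta_db: float) -> float:
    return 10 ** (delta_db / 10)

def perceived_loudness_ratio(delta_db: float) -> float:
    return 2 ** (delta_db / 10)

for delta in (2.0, 2.2, 4.5):  # the idle, FurMark, and BF3 gaps from above
    print(f"{delta} dB: {sound_power_ratio(delta):.2f}x sound power, "
          f"~{perceived_loudness_ratio(delta):.2f}x perceived loudness")
```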

Ultimately, with a sample size of one it’s not possible to isolate every factor, but based on our data we suspect we have a poorer-than-average GPU on EVGA’s 760SC ACX. The greater power consumption under BF3 and the higher temperatures point to a hotter-than-average chip, while also ruling out fan curve differences as a factor. That said, we can’t eliminate the 9.5” ACX cooler from the equation, and it may well play a part too.

As a result of all of this, Gigabyte comes just short of a clean sweep in our power, temperature, and noise testing. The only place where EVGA’s card does better is load power consumption under FurMark, and that’s almost certainly down to power target differences between the two cards. In gaming scenarios power consumption should track much closer to the BF3 results, and in those scenarios Gigabyte does better.


22 Comments


  • hags2k - Monday, October 7, 2013 - link

    I like the design of the GB card and really do think it's superior, but I've grown to really love EVGA's software package and have made use of their transferable warranty twice already - the "added value" really is value in this case. It's a tough call!
  • Subyman - Monday, October 7, 2013 - link

    I have an MSI TF 760 and couldn't be happier. I compared them all when they first came out and the MSI and GB were the quietest and coolest. I was very pleased with the quality, especially considering the price. Would love to see the MSI represented.
  • Teizo - Monday, October 7, 2013 - link

    Not sure why you guys didn't include the MSI 760 Gaming, or the ASUS Direct CU. I guess you didn't have them on hand.
  • ShieTar - Tuesday, October 8, 2013 - link

    Since the 760 is consistently showing >80% of the performance of a 770 at 60% of the cost, and since NVIDIA's drivers seem to handle frame pacing in SLI mode quite well now, I would really love to see some performance tests for a set of 760s in SLI. Could you please add those tests?
  • Impulses - Tuesday, October 8, 2013 - link

    Yeah, SLI 760s seem like a terrific value, faster than any single card by a good margin and cost effective enough to make the 770 a bit irrelevant... I'm trying to decide whether to get two 760s as an upgrade from my 6970 x2 setup, or save up for a 780 x2...
  • Nfarce - Tuesday, October 8, 2013 - link

    Guru3D has just such an animal of a review dating from this past June. Just Google "GeForce GTX 760 SLI" and you'll see it among the top two or three links that come up. The only downside is they don't review all the games (no BF3, no Crysis 3, no Far Cry 3 specifically). But the SLI setup beats the Titan in Tomb Raider & Bioshock Infinite by 13-18% at 2560x1440. Very nice bang for the buck.
  • idiot consumer - Sunday, October 13, 2013 - link

    It is nice to see new gear coming out, BUT:

    Probably 50% of nvidia card owners have run into the famous "video driver stopped responding and has recovered" error.

    There is no cure or solution from nvidia or micro$oft.
    The only solution is to buy a new card, until it happens again.

    Forums all over the world are full of complaints.

    How come major reviewers like AnandTech couldn't care less?
  • Galidou - Sunday, October 13, 2013 - link

    If they started to cover driver issues, neither company would want their video cards reviewed. The point here is to show the performance, not the possibility of various bugs/problems unless they're critical, BUT:

    I've had a GTX 660 Ti for a year now and the problem has cursed me for a long time. It is/was worse in some games. I fixed part of it by going back to an earlier driver. I had a 6870 + 6850 in Crossfire (I figured if I had any problem with Crossfire I could disable it and play with the 6870) and never had trouble with them, EVER.
  • idiot consumer - Sunday, October 13, 2013 - link

    "If they started to cover driver issues, neither company would want their video cards reviewed. The point here is to show the performance, not the possibility of various bugs/problems unless they're critical."

    Considering that nvidia suffered a class action lawsuit (in the US only, unfortunately) and has settled it, that confirms the issues are critical.

    The good old days of excellent nvidia cards are gone forever.
    I shall never buy from nvidia again.
  • Ryan Smith - Monday, October 14, 2013 - link

    We report on things we see first hand and things we can reproduce. And right now we can't reproduce any NVIDIA driver stability issues (and not for a lack of trying).
