Power, Temperature, & Noise

Having seen the gaming performance of these cards and the limited gains from their factory overclocks, let’s move on to power, temperature, and noise. With their virtually identical clockspeeds and performance, if these cards are going to differentiate themselves, it will have to be through factors the manufacturers have more control over, and their respective coolers are the most obvious place to do so.

Idle Power Consumption

Load Power Consumption - Battlefield 3

Load Power Consumption - FurMark

Curiously, power consumption shows a much wider variation than we’d expect. The 22W at-the-wall difference under BF3 is the kind of gap we’d see between cards of different product lines rather than two cards within the same product line. In this case Gigabyte’s power consumption is practically tied with our reference GTX 760, while the EVGA card is clearly drawing more power, putting the EVGA card at a disadvantage. Unfortunately it’s one that will have repercussions for temperatures and acoustics.

FurMark meanwhile flips the picture entirely. Now the Gigabyte 760OC is drawing more power than the EVGA card, on the order of 30W, and a full 72W more than the stock GTX 760. With FurMark being a fairly pure test of TDP and sustained loads, there appear to be two things going on. First and foremost, the significant cooling capacity of the Windforce cooler means that the card never temperature throttles and always reaches its TDP/voltage limits, even under FurMark. Second, since this is a recycled Gigabyte design intended for a 225W+ card, it would appear that the power target for this card is higher than the 170W given in the card’s official specs.
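The interplay described above – a cooler good enough that the power target, rather than temperature, becomes the binding limit – can be sketched conceptually. This is a simplified illustration, not NVIDIA's actual GPU Boost logic (which also weighs voltage and other inputs), and the 200W figures are hypothetical:

```python
def active_limiter(power_w, power_target_w, temp_c, temp_target_c=80):
    """Conceptual sketch of GPU Boost 2.0-style limiting: whichever
    limit is hit first caps the boost clock. Simplified; real hardware
    also considers voltage and utilization."""
    if temp_c >= temp_target_c:
        return "temperature"
    if power_w >= power_target_w:
        return "power"
    return "none"

# A Windforce-cooled card under FurMark: cool enough that the power
# target, not temperature, ends up being the limit that binds.
print(active_limiter(power_w=200, power_target_w=200, temp_c=72))  # power
```

In other words, a strong cooler doesn't just lower temperatures; it shifts which limit the card runs into, which is why the Gigabyte card is free to draw up to its (elevated) power target under FurMark.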

The EVGA card, though drawing less power than the Gigabyte card, falls into a similar scenario. 42W over our reference GTX 760 at the wall is nothing to sneeze at. EVGA’s card likewise appears to be designed for a higher TDP, though a lower one than Gigabyte’s. And with both cards being built for higher wattages, there’s also some room for overhead from the use of more complex power delivery systems than what we saw in the reference GTX 760.
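Note that at-the-wall figures like the 42W gap above include PSU conversion losses, so the card's own DC-side draw is somewhat lower. A rough conversion, assuming a hypothetical 88% PSU efficiency (the review doesn't state the test PSU's rating at this load):

```python
# Assumed PSU efficiency at this load level; an assumption for
# illustration, not a measured figure from the testbed.
PSU_EFFICIENCY = 0.88

def dc_delta(wall_delta_w, efficiency=PSU_EFFICIENCY):
    """Estimate the DC-side power difference behind an AC-side delta.
    The PSU dissipates the remaining fraction of input power as heat."""
    return wall_delta_w * efficiency

# The 42W at-the-wall gap over the reference card works out to roughly:
print(round(dc_delta(42)))  # ~37W of additional DC-side draw
```

This is why at-the-wall deltas slightly overstate the difference in what the cards themselves are consuming.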

Regardless, in these cases we’re more interested in gaming power consumption than FurMark power consumption, since we don’t expect Gigabyte’s higher power target to be reached at stock; the GTX 760 should run out of headroom in other areas first. As such, Gigabyte looks to have a power consumption advantage where it will matter most.

Idle GPU Temperature

Load GPU Temperature - Battlefield 3

Load GPU Temperature - FurMark

Moving on to temperatures, we’re seeing a clear advantage for Gigabyte’s design here. The 760OC runs 6C cooler under BF3, and 7C cooler under FurMark. Under FurMark both cards remain easily below the 80C throttle point, but it’s still a clear victory for Gigabyte. EVGA’s higher power consumption isn’t doing them any favors, and unless there’s a radical difference in the fan curves – and as we’ll see there isn’t – this gap comes down to a combination of that higher power consumption and less effective cooling.
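For readers unfamiliar with the fan curves mentioned above: vendors typically define fan speed as a set of temperature/duty-cycle breakpoints with interpolation between them. Neither vendor publishes their curve, so the breakpoints below are purely hypothetical, but the mechanism looks roughly like this:

```python
# Hypothetical fan curve as (temperature C, fan duty %) breakpoints,
# linearly interpolated. The actual values on these cards are unknown.
CURVE = [(30, 25), (60, 40), (80, 65), (95, 100)]

def fan_duty(temp_c):
    """Return the fan duty cycle (%) for a given GPU temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # Linear interpolation between the two bracketing points
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]

print(fan_duty(70))  # halfway between 40% and 65% -> 52.5
```

With similar curves on both cards, a hotter-running card is pushed further up the curve, which is how power consumption differences turn into noise differences as well.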

Idle Noise Levels

Load Noise Levels - Battlefield 3

Load Noise Levels - FurMark

Finally, looking at our noise results, we’re once again seeing a clear advantage for Gigabyte. Along with being cooler, their 760OC also ends up being quieter than EVGA’s 760SC under both BF3 and FurMark. The difference under BF3 ends up being 4.5dB, and even with Gigabyte’s higher power consumption under FurMark it’s still a 2.2dB advantage. EVGA’s performance is by no means poor for a 170W+ open air cooled card, but as we can see from these results Gigabyte’s 760OC is going to be noticeably quieter under the same workloads. Even idle noise is in Gigabyte’s favor by 2dB.
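To put those decibel gaps in perspective, dB is a logarithmic scale: sound pressure scales as 10^(ΔdB/20), and a common psychoacoustic rule of thumb holds that +10 dB is perceived as roughly twice as loud. A quick sketch of what the 4.5dB BF3 gap means:

```python
def pressure_ratio(db_delta):
    """Sound pressure ratio for a given dB difference;
    +6 dB is roughly double the sound pressure."""
    return 10 ** (db_delta / 20)

def loudness_ratio(db_delta):
    """Rule-of-thumb perceived loudness: +10 dB sounds about twice as loud."""
    return 2 ** (db_delta / 10)

print(round(pressure_ratio(4.5), 2))  # 1.68x the sound pressure
print(round(loudness_ratio(4.5), 2))  # ~1.37x as loud to the ear
```

So the 4.5dB gap is well past the threshold of a clearly audible difference, consistent with the "noticeably quieter" characterization above.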

Ultimately with a sample size of one it’s not possible to isolate every factor, but based on our data we suspect we have a poorer than average GPU on the 760SC ACX from EVGA. The greater power consumption under BF3 and the higher temperatures point to a hotter than average chip while also ruling out fan curve differences as a factor. That said we can’t eliminate the 9.5” ACX cooler from the equation, and indeed that may play a part as well.

As a result of all of this, Gigabyte comes just short of a clean sweep in our power, temp, and noise testing. The only place where EVGA’s card does better is power consumption under load with FurMark, and that’s almost certainly down to power target differences between the two cards. In gaming scenarios power consumption should trend much closer to the BF3 results, and in those scenarios Gigabyte does better.

22 Comments

  • idiot consumer - Monday, October 14, 2013 - link

    Hi Ryan;
    I trust you in that regard. However, you have to consider the large number of NVIDIA driver failures: "video driver stopped responding and has recovered" is a message dreaded by thousands.
    Now, the response "And right now we can't reproduce any NVIDIA driver stability issues (and not for a lack of trying)" - despite many forums full of frustrated users - is oil on the fire. This is the response from NVIDIA and Microsoft alike. After spending many hours collecting info, I found a way to cure the problem.
    Every case is different, as it depends on the programs installed, the number and type of Microsoft updates, and the user's settings in the NVIDIA control panel.
    That is why brand new installations work fine - and then, after a few months, the dreaded driver failure returns...
    It cost me way too much time and frustration just to find that the relationship between Windows 7 and NVIDIA drivers is strange: they do not talk to each other when new drivers or Windows updates come down online.
    As a result, customers are suffering.
    I have fixed my two systems - not a single fault since - but not all customers have the knowledge, time, and nerves to fix something that should never happen.
    Best regards to you - I do understand that you have to do what you do...
  • darrrio - Tuesday, August 5, 2014 - link

    I just bought the EVGA right before I saw this article... now I'm afraid I'm not sure I did the right thing. The Gigabyte is in every way nicer than the EVGA... What stresses me most is the temperature :( I don't know if I should refund the card and get a Gigabyte instead.
