Power, Temperature, & Noise

Last but not least of course is our look at power, temperature, and noise. Because the GTX 650 Ti Boost is essentially a reconfigured GTX 660 there aren’t going to be any grand revelations here, but it will be useful to figure out just what the real-world power savings will be from fusing off that one SMX.

GeForce GTX 650 Ti Boost Voltages
Ref GTX 650 Ti Boost Idle: 0.887v
Ref GTX 650 Ti Boost Load: 1.175v
Ref GTX 660 Load: 1.175v

With a peak load voltage of 1.175v, the GTX 650 Ti Boost tops out at the same voltage as the rest of the boost-enabled Kepler family.

Before we jump into our graphs, let’s take a look at the average core clockspeed during our benchmarks. Because the GTX 650 Ti Boost has the same TDP as the GTX 660 but at least marginally lower power consumption due to the disabled SMX, it’s in an interesting position where it has more headroom for boosting than its fully-enabled counterpart. As a result we’re seeing far less variability than we saw with the GTX 660 when we reviewed it last year. With the exception of BF3, every game is sustained at the top boost bin of 1071MHz. Based on these results it would appear that in practice the GTX 650 Ti Boost operates at a marginally higher average clockspeed than the otherwise superior GTX 660.

GeForce GTX 650 Ti Boost Average Clockspeeds
Max Boost Clock: 1071MHz
DiRT:S: 1071MHz
Shogun 2: 1071MHz
Hitman: 1071MHz
Sleeping Dogs: 1071MHz
Crysis: 1071MHz
Far Cry 3: 1071MHz
Battlefield 3: 1058MHz
Civilization V: 1071MHz
FurMark: 992MHz
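
For readers who want to check sustained boost behavior on their own cards, the general approach is simply to poll the core clock while a benchmark runs and average the samples. Below is a minimal sketch of that idea in Python using nvidia-smi; it is only an illustration rather than our internal test harness, and the 60-second run length and one-second polling interval are arbitrary assumptions.

```python
import subprocess
import time

def sample_core_clock_mhz():
    """Read the current graphics clock (MHz) of the first GPU via nvidia-smi."""
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=clocks.current.graphics",
        "--format=csv,noheader,nounits",
    ])
    return int(out.decode().strip().splitlines()[0])

def average_clock(seconds=60, interval=1.0):
    """Poll the core clock once per interval and return the average over the run."""
    samples = []
    end = time.time() + seconds
    while time.time() < end:
        samples.append(sample_core_clock_mhz())
        time.sleep(interval)
    return sum(samples) / len(samples)

if __name__ == "__main__":
    # Start the benchmark separately, then run this alongside it.
    print(f"Average core clock: {average_clock():.0f} MHz")
```

Comparing the resulting average against the card’s advertised boost clock gives a quick read on whether a given sample holds its top boost bin the way ours does.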

Idle Power Consumption

Starting as always with idle power, 110W at the wall is par for the course for most NVIDIA cards. With the reactivation of the 3rd ROP partition, the GTX 650 Ti Boost gives up the slight advantage the GTX 650 Ti gained here.

Load Power Consumption - Battlefield 3

Moving on to our first and arguably most important load power test, we have BF3. Power consumption in BF3 scales slightly with performance due to the extra work required of the CPU to feed more frames to a video card, but it usually paints some clear trends and this is no exception. NVIDIA may only be giving the GTX 650 Ti Boost an official TDP 6W lower than the GTX 660’s, but it’s clear the real-world gap is a bit larger than that; we’re seeing 289W at the wall versus 298W for the GTX 660. On the other hand these results are 28W higher than the GTX 650 Ti, and even 18W higher than the 7850, never mind the 7790. The GTX 650 Ti Boost’s performance here is well ahead of everything other than the GTX 660, so this jump in power consumption does come with a performance boost, but it serves as a reminder that there is a tradeoff to be made. In the case of the GTX 650 Ti Boost, we’re seeing it pull away a bit from the efficiency curve set by NVIDIA’s other products.

Load Power Consumption - FurMark

As for power consumption under FurMark, we’re seeing more muted results for the GTX 650 Ti Boost. Here it’s roughly halfway between the GTX 650 Ti and GTX 660, and probably should be a bit higher still. The 27W difference at the wall between the GTX 660 and GTX 650 Ti Boost is more than we would expect, and more than makes sense for cards that are otherwise identical save a single fused-off SMX, so we may also be seeing chip-to-chip variation play a part here. In any case, power consumption is also similar to the 7850, but this is one of those scenarios where we put more faith in the BF3 numbers than the FurMark numbers; NVIDIA simply appears to be more aggressive about throttling here.
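
As a side note, our power figures are measured at the wall and therefore capture the whole system, not just the video card. Readers who want to spot-check board-level draw on their own GPU can read NVML’s power counter; the sketch below is a rough illustration of that approach using the pynvml bindings, with the assumption that the nvidia-ml-py package is installed, and with the caveat that not every board exposes this counter and board power will never line up with at-the-wall numbers.

```python
import time

import pynvml  # NVML bindings; assumed installed via the nvidia-ml-py package

def log_board_power(seconds=30, interval=1.0):
    """Print the GPU's self-reported board power draw once per interval."""
    pynvml.nvmlInit()
    try:
        handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
        end = time.time() + seconds
        while time.time() < end:
            milliwatts = pynvml.nvmlDeviceGetPowerUsage(handle)  # reported in milliwatts
            print(f"Board power: {milliwatts / 1000.0:.1f} W")
            time.sleep(interval)
    finally:
        pynvml.nvmlShutdown()

if __name__ == "__main__":
    # Kick off FurMark or a game first, then run this to watch power draw under load.
    log_board_power()
```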

Idle GPU Temperature

Utilizing the same cooler as the GTX 660, there’s no surprise here in seeing the GTX 650 Ti Boost hit the same 30C idle temperatures.

Load GPU Temperature - Battlefield 3

Similarly, our temperature results here closely parallel the GTX 660’s under load. The GTX 650 Ti Boost consumes a bit less power than the GTX 660, and doesn’t get quite as warm as a result. The large jump from the GTX 650 Ti comes as a bit of a shock to the eyes at first, but as the GTX 650 Ti Boost is a blower and one with a conservative fan curve at that, this is to be expected.

Load GPU Temperature - FurMark

With FurMark we see temperatures go up, but for the most part things are consistent with what we saw under BF3. The larger gap between the GTX 650 Ti Boost and GTX 660 reflects the larger gap in power consumption we saw earlier.
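
The same NVML approach sketched in the power section extends to temperature: swap the power query for the temperature query. Again assuming the pynvml bindings are available, a minimal example looks like this.

```python
import pynvml  # NVML bindings; assumed installed via the nvidia-ml-py package

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
# NVML reports the core temperature sensor in degrees Celsius.
temp_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
print(f"GPU temperature: {temp_c} C")
pynvml.nvmlShutdown()
```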

Idle Noise Levels

Just as with idle temperatures, the same GPU on the same cooler means we’re looking at the same idle noise levels. The blower NVIDIA uses here is quite good, but it can’t compete with simple open-air coolers like the 7790’s.

Load Noise Levels - Battlefield 3

There’s a clear cutoff here between the open-air coolers and the blowers. The GTX 650 Ti Boost is quite a bit faster than something like the 7790 here, but the difference in noise is remarkable. Some of NVIDIA’s partner cards should fare much better here as they pack open-air coolers, with the usual tradeoff of giving up a fully-exhausting design. Still, this is a reminder that the GTX 650 Ti Boost pulls back from the efficiency curve a bit; it’s about 15% slower than the GTX 660 but no quieter for it.

Load Noise Levels - FurMark

Finally we have load noise under FurMark. NVIDIA’s more aggressive throttling here means that our results don’t jump up too much from BF3. The GTX 650 Ti Boost does finally end up being a bit quieter than the GTX 660 due to the former’s lower power consumption, and this is the only time we see the GTX 650 Ti Boost gain an edge on the 7850 in power/temp/noise.

Comments

  • Hrel - Tuesday, March 26, 2013 - link

    More realistically, price the GTX660 at MOST 180. I can find 7850's for 160, from XFX no less.

    Anandtech, please add an "edit" function to your comments. Also, I want an email when someone responds to me. Then to be able to click a link that takes me directly to that comment, instead of having to plow through 100's of comments.
  • CiccioB - Tuesday, March 26, 2013 - link

    nvidia prices its solutions at the price it thinks is best. If the GTX660 sells like cookies, why on earth should they lower the price? To have a red quarter like AMD?
    And possibly they have quite a few GK106 chips with just some shaders dead but the memory controller completely working. Those pieces would otherwise have to be sold at the GTX650 Ti price. With this move they can sell them at a bit of a premium price.
    Consider that for nvidia this new board cost nothing, while AMD had to forge a new chip, which has a cost.
  • Hrel - Tuesday, March 26, 2013 - link

    It'd be nice to Just Cause 2 in your benchmarks. It has a built in benchmark and everything. Awesome game people will be playing for years, considering they added multi-player. I know you're still working on the benchmark suite, so this is a suggestion I'd really like to see.
  • Hrel - Tuesday, March 26, 2013 - link

    nice to see*

    Holy Batman do you guys need to add an edit function to your comments.
  • aTonyAtlaw - Tuesday, March 26, 2013 - link

    I would posit that perhaps you need to proofread your comments more than Anandtech needs to provide an edit function.
  • skiboysteve - Tuesday, March 26, 2013 - link

    it's good to keep in mind that open air coolers can be very loud if you don't have a well ventilated case like me. I have a 6850 with an open air cooler and the thing is VERY loud because it gets so crazy hot inside my case. If I had a blower on it, it wouldn't be nearly as loud
  • marc1000 - Tuesday, March 26, 2013 - link

    where is Starcraft II ? it's no longer part of the test suite?
  • Ryan Smith - Tuesday, March 26, 2013 - link

    Yes, it was removed. It gets rather silly on high-end cards these days, which is what we base our benchmark selections on.
  • Oxford Guy - Tuesday, March 26, 2013 - link

    How about Skyrim with the high resolution textures? I've heard that that requires 2 GB to run decently. That would be nice to see tested when the silly 1 GB card is released.
  • warezme - Tuesday, March 26, 2013 - link

    I think it would be interesting to also post mobile GPU numbers along with these cards. In this field of models there is some relevance related to how the two types would perform in similar games as a comparison.
