Power, Temperature, & Noise

Last but not least of course is our look at power, temperature, and noise. Because the GTX 650 Ti Boost is essentially a reconfigured GTX 660 there aren’t going to be any grand revelations here, but it will be useful to figure out just what the real-world power savings will be from fusing off that one SMX.

GeForce GTX 650 Ti Boost Voltages
Ref GTX 650 Ti Boost Idle: 0.887v
Ref GTX 650 Ti Boost Load: 1.175v
Ref GTX 660 Load: 1.175v

With a peak load voltage of 1.175v, the GTX 650 Ti Boost tops out at the same voltage as the rest of the boost-enabled Kepler family.

Next, before we jump into our graphs, let’s take a look at the average core clockspeed during our benchmarks. Because the GTX 650 Ti Boost has the same TDP as the GTX 660 but at least marginally lower power consumption due to the disabled SMX, it’s in an interesting position where it has more headroom for boosting than its fully-enabled counterpart. As a result we’re seeing far less variability than we saw with the GTX 660 when we reviewed it last year. With the exception of BF3, every game is sustained at the top boost bin of 1071MHz. Based on these results, in practice the GTX 650 Ti Boost operates at a marginally higher average clockspeed than the otherwise superior GTX 660.

GeForce GTX 650 Ti Boost Average Clockspeeds
Max Boost Clock: 1071MHz
DiRT:S: 1071MHz
Shogun 2: 1071MHz
Hitman: 1071MHz
Sleeping Dogs: 1071MHz
Crysis: 1071MHz
Far Cry 3: 1071MHz
Battlefield 3: 1058MHz
Civilization V: 1071MHz
FurMark: 992MHz
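For reference, logging sustained clockspeeds like these is straightforward to reproduce at home. Below is a minimal sketch using NVIDIA’s NVML bindings for Python (the pynvml package); the device index, sampling interval, and run length are illustrative assumptions rather than a description of our own test harness.

```python
# Minimal clockspeed logging sketch using NVML (pynvml).
# Assumptions: the card under test is GPU 0, and we sample once per
# second for 60 seconds while the benchmark is running.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumed device index

samples = []
for _ in range(60):
    # Current graphics (core) clock in MHz
    samples.append(pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS))
    time.sleep(1)

print(f"Average core clock: {sum(samples) / len(samples):.0f}MHz")
pynvml.nvmlShutdown()
```

nvidia-smi’s --query-gpu interface can serve a similar purpose from the command line if you’d rather not script it.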

Idle Power Consumption

Starting as always with idle power, 110W at the wall at idle is par for the course for most NVIDIA cards. With the reactivation of the 3rd ROP partition, the GTX 650 Ti Boost gives up the slight advantage the GTX 650 Ti held here.

Load Power Consumption - Battlefield 3

Moving on to our first and arguably most important load power test, we have BF3. Power consumption in BF3 scales slightly with performance due to the extra work required of the CPU to feed more frames to a video card, but it usually paints some clear trends, and this is no exception. NVIDIA may only give the GTX 650 Ti Boost an official TDP 6W lower than the GTX 660’s, but it’s clear the real-world difference is a bit larger than that; we’re seeing 289W at the wall versus 298W for the GTX 660. On the other hand these results are 28W higher than the GTX 650 Ti, and even 18W higher than the 7850, never mind the 7790. The GTX 650 Ti Boost’s performance here is well ahead of everything other than the GTX 660, so this jump in power consumption does come with a performance payoff, but it serves as a reminder that there is a tradeoff to be made. In the case of the GTX 650 Ti Boost, we’re seeing it pull away a bit from the efficiency curve set by NVIDIA’s other products.
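As a rough aside on reading these numbers, wall (AC) measurements slightly overstate card-to-card differences, since PSU losses scale with draw. Here is a back-of-the-envelope sketch, assuming a nominal 85% PSU efficiency at this load (an assumption for illustration, not a measured figure for our testbed):

```python
# Converting the measured wall (AC) power delta between the GTX 660 and
# GTX 650 Ti Boost into an estimated DC-side (card-level) delta.
# The 85% PSU efficiency figure is an assumption for illustration only.
wall_gtx660 = 298      # watts at the wall, BF3
wall_650ti_boost = 289  # watts at the wall, BF3
psu_efficiency = 0.85   # assumed nominal efficiency at this load

dc_delta = (wall_gtx660 - wall_650ti_boost) * psu_efficiency
print(f"Estimated card-level power difference: {dc_delta:.1f}W")  # ~7.7W, still above the 6W TDP gap
```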

Load Power Consumption - FurMark

As for power consumption under FurMark, the GTX 650 Ti Boost’s results are more muted. Here it’s roughly halfway between the GTX 650 Ti and GTX 660, and probably should be a bit higher still. The fact that there’s a difference of 27W at the wall between the GTX 660 and GTX 650 Ti Boost is more than we would expect, and more than makes sense for cards that are identical save for a single fused-off SMX, so we may also be seeing chip-to-chip variation play a part here. In any case, power consumption is also similar to the 7850’s, but this is one of those scenarios where we put more faith in the BF3 numbers than the FurMark numbers; NVIDIA simply appears to be more aggressive about throttling here.

Idle GPU Temperature

Utilizing the same cooler as the GTX 660, there’s no surprise here in seeing the GTX 650 Ti Boost hit the same 30C idle temperatures.

Load GPU Temperature - Battlefield 3

Similarly, our temperature results under load closely parallel the GTX 660’s. The GTX 650 Ti Boost consumes a bit less power than the GTX 660, and as a result doesn’t get quite as warm. The large jump from the GTX 650 Ti comes as a bit of a shock at first, but as the GTX 650 Ti Boost uses a blower, and one with a conservative fan curve at that, this is to be expected.

Load GPU Temperature - FurMark

With FurMark we see temperatures go up, but for the most part things are consistent with what we saw under BF3. The larger gap between the GTX 650 Ti Boost and GTX 660 reflects the larger gap in power consumption we saw earlier.

Idle Noise Levels

Just as with idle temperatures, the same GPU on the same cooler means we’re looking at the same idle noise levels. The blower NVIDIA uses here is quite good, but it can’t compete with simple open-air coolers like the 7790’s.

Load Noise Levels - Battlefield 3

There’s a clear cutoff here between the open-air coolers and the blowers. The GTX 650 Ti Boost is quite a bit faster than something like the 7790 here, but the difference in noise is remarkable. Some of NVIDIA’s partner cards should fare much better here as they pack open-air coolers, with the usual tradeoff of giving up a fully-exhausting design. Still, this is a reminder that the GTX 650 Ti Boost pulls back from the efficiency curve a bit; it’s about 15% slower than the GTX 660 but no quieter for it.

Load Noise Levels - FurMark

Finally we have load noise under FurMark. NVIDIA’s more aggressive throttling here means that our results don’t jump up too much from BF3. The GTX 650 Ti Boost does finally end up being a bit quieter than the GTX 660 thanks to the former’s lower power consumption, and this is the only time we see the GTX 650 Ti Boost gain an edge on the 7850 in power/temp/noise.

Comments

  • piroroadkill - Tuesday, March 26, 2013 - link

    1GiB 7790s are about the same price here as 1GiB 7850s - no joke, for example:
    http://www.ebuyer.com/492110-asus-hd-radeon-hd-779...
    http://www.ebuyer.com/393396-asus-hd-7850-1gb-gddr...

    So what's the point? Save a bit more money, get a 7850 2GB and overclock the balls off it...
  • HighTech4US - Tuesday, March 26, 2013 - link

    7850 1GB cards are going EOL, so if you want one you'd better grab it quick.

    http://www.fudzilla.com/home/item/30865-radeon-hd-...
  • piroroadkill - Tuesday, March 26, 2013 - link

    That makes sense, it replaces that part. In that case, you're getting screwed at that price point, and you should pick up a 7850 instead as soon as possible.

    Myself, I don't need an upgrade yet, my 6950 2GiB with unlocked shaders is fine..
  • chizow - Tuesday, March 26, 2013 - link

    This is probably the first Kepler part Nvidia has launched so far that actually comes off looking like a good value. It's probably where price:performance should've been a year ago, but it has taken nearly a full year for 28nm prices to trickle down to this point. Still, it's pretty amazing how much Nvidia has milked Kepler. They now have 7-8 SKUs (not counting OC variants) in this sub-$300 market based off of 3 ASICs (GK104, GK106, GK107). Reminds me of that Mickey Mouse cartoon where they keep slicing off razor thin pieces of bean. At least this part makes sense however and fills a pretty cavernous void in that $150-$200 range between the 660 and old 650Ti.

    Valid point to be made however about the huge disparity in gaming bundles. AMD really is kicking Nvidia's teeth in with their gaming bundles of late. Nvidia's F2P bundle stinks compared to AMD's recent offerings of Crysis 3, BioShock Infinite, Tomb Raider etc. In a $150-200 market where a hot AAA game can easily account for 1/4 to 1/3 of the sticker price, the perceived bundle value does matter. I'm sure it helped the 650Ti with AC3, but that card was a bit of an underperformer relative to even last-gen cards. The cards in the $150+ range are much better performers, actually providing tangible upgrades from most last-gen parts in this range (GTX 560, 6850 etc).
  • Bob Todd - Tuesday, March 26, 2013 - link

    Spot on about the huge disparity in the game bundles. In the last two months I've picked up a 2GB 7850 and two 7870s. Without Never Settle Reloaded I honestly probably wouldn't have bought any of them. Sold two of the bundles and kept one.
  • HighTech4US - Tuesday, March 26, 2013 - link

    A number of problems with this review.

    #1 The latest Nvidia WHQL driver is 314.22 and was released yesterday. It shows improvements in Sleeping Dogs. So why is this review using the older 314.21 driver set?

    http://techreport.com/news/24560/new-geforce-drive...

    #2 Also the HD 7850 1GB is going EOL so why even do comparisons with a card that won't exist very soon.

    http://www.fudzilla.com/home/item/30865-radeon-hd-...
  • tfranzese - Tuesday, March 26, 2013 - link

    Concerning #1, are all fanboys this stupid? You do realize that a lot of work goes into these reviews and they're not done in a < 24 hour turnaround.
  • DanNeely - Tuesday, March 26, 2013 - link

    Re #2; even after being EOLed by the manufacturer the old models tend to linger in the channel for a while.
  • whyso - Tuesday, March 26, 2013 - link

    Good card compared to the 7790, but the 7850 2GB is still a better buy: you get two games with it, and if you overclock, the 7850 is going to eat the 650 Ti Boost (the 650 Ti Boost doesn't have much overclocking room starting at over 1050MHz, vs. the 7850's 860MHz). It competes much better against the low end (1GB) than against the higher end.
  • Hrel - Tuesday, March 26, 2013 - link

    Ok, instead of just assuming Nvidia is evil.

    WHY didn't they just drop the price of the GTX 660 to like $170 MSRP? I mean, if they're just fusing off part of the card, their cost is the same, if not higher due to whatever labor is involved in fusing off that SMX. This, IMO, is a card that shouldn't even exist. The GTX 660 is priced far too high for the performance offered. Random FPS hiccups or no, all my recommendations are AMD until Nvidia stops pricing themselves out of competition. And this is coming from someone who was Nvidia-only for a long time, ever since I had 3 horrid experiences with ATI in a row back in the day.
