Power, Temperature, & Noise

As always, last but not least is our look at power, temperature, and noise. Next to price and performance of course, these are some of the most important aspects of a video card, due in large part to the impact of noise. All things considered, a loud card is undesirable unless there’s a sufficiently good reason – or sufficiently good performance – to put up with it.

GeForce GTX 750 Series Voltages
                 Ref GTX 750 Ti    Zotac GTX 750 Ti    Zotac GTX 750
Boost Voltage    1.168v            1.137v              1.187v

For those of you keeping track of voltages, you’ll find that the voltages for GM107 as used on the GTX 750 series are not significantly different from the voltages used on GK107. Since we’re looking at a chip that’s built on the same 28nm process as GK107, the voltages needed to drive it to the desired frequencies have not changed.

GeForce GTX 750 Series Average Clockspeeds
                   Ref GTX 750 Ti    Zotac GTX 750 Ti    Zotac GTX 750
Max Boost Clock    1150MHz           1175MHz             1162MHz
Metro: LL          1150MHz           1172MHz             1162MHz
CoH2               1148MHz           1172MHz             1162MHz
Bioshock           1150MHz           1175MHz             1162MHz
Battlefield 4      1150MHz           1175MHz             1162MHz
Crysis 3           1149MHz           1174MHz             1162MHz
Crysis: Warhead    1150MHz           1175MHz             1162MHz
TW: Rome 2         1150MHz           1175MHz             1162MHz
Hitman             1150MHz           1175MHz             1162MHz
GRID 2             1150MHz           1175MHz             1162MHz
Furmark            1006MHz           1032MHz             1084MHz

Looking at average clockspeeds, we can see that our cards are essentially free to run at their maximum boost bins, well above their base clockspeeds or even their official boost clockspeeds. Because these cards operate at such a low TDP, cooling is rendered a non-factor in our testbed setup, with all of these cards easily staying at or below 60C, well under the 80C thermal throttle point that GPU Boost 2.0 uses.

As such they are limited only by TDP, which as we can see does make itself felt, but is not a meaningful limitation. Both GTX 750 Ti cards become TDP limited at times while gaming, but only for a refresh period or two, pulling the averages down just slightly. The Zotac GTX 750 on the other hand has no such problem (thanks to the power savings from losing an SMX), so it stays at 1162MHz throughout the entire run.
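The clock behavior described above can be sketched as a simple model. To be clear, this is an illustrative simplification and not NVIDIA's actual boost algorithm; the ~13MHz bin size and the assumption that board power scales linearly with clockspeed are both rough approximations:

```python
# Illustrative sketch of GPU Boost 2.0's clock selection, NOT NVIDIA's
# actual algorithm: the card runs at its maximum boost bin unless it
# exceeds its TDP or its thermal throttle point, stepping down one
# ~13MHz bin at a time (bin size is an approximation).

BIN_STEP_MHZ = 13

def boost_clock(max_boost_mhz, base_mhz, power_w, tdp_w, temp_c,
                throttle_temp_c=80):
    """Return the clockspeed this simplified model settles at.

    power_w is the board power drawn at the maximum boost bin; we
    assume (roughly) that power scales linearly with clockspeed.
    """
    clock = max_boost_mhz
    while clock > base_mhz and (power_w * clock / max_boost_mhz > tdp_w
                                or temp_c >= throttle_temp_c):
        clock -= BIN_STEP_MHZ
    return max(clock, base_mhz)

# A GTX 750 Ti-like card (1020MHz base, 1150MHz max boost, 60W TDP)
# gaming well under its limits runs at its full boost bin...
print(boost_clock(1150, 1020, power_w=55, tdp_w=60, temp_c=60))  # 1150

# ...while a FurMark-style load pushes it past TDP, so the model
# steps down bin by bin until it bottoms out at the base clock.
print(boost_clock(1150, 1020, power_w=68, tdp_w=60, temp_c=62))  # 1020
```

Since our testbed keeps these cards at or below 60C, only the TDP term ever bites in practice, and as the table above shows, only briefly.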

Idle Power Consumption

Load Power Consumption - Crysis 3

Load Power Consumption - FurMark

Idle GPU Temperature

Load GPU Temperature - Crysis 3

Load GPU Temperature - FurMark

Idle Noise Levels

Load Noise Levels - Crysis 3

Load Noise Levels - FurMark

177 Comments

  • Mondozai - Wednesday, February 19, 2014 - link

    Wait for 800 series budget cards if you have the patience. Hopefully no more than 4-5 months if TSMC does very well on 20nm.
  • Jeffrey Bosboom - Wednesday, February 19, 2014 - link

    I understand the absolute hashrate on these cards will be low, but I'm interested to know how the focus on power consumption improves mining performance per watt. (Though I can't imagine these lowish-end cards would be used, even if efficient, due to the fixed cost of motherboards to put them in.)
  • Antronman - Wednesday, February 19, 2014 - link

    Nvidia's best cards have tiny hash rates compared to 95% of every AMD GPU ever released.
  • JarredWalton - Wednesday, February 19, 2014 - link

    Apparently you're not up to speed on the latest developments. GTX 780 Ti as an example is now hitting about 700 KHash in scrypt, and word is the GTX 750 will be pretty competitive with 250-260 KHash at stock and much lower power consumption. Some people have actually put real effort into optimizing CUDAminer now, so while AMD still has an advantage, it's not nearly as large as it used to be. You could even make the argument that based on perf/watt in mining, some of NVIDIA's cards might even match AMD's top GPUs.
  • darthrevan13 - Wednesday, February 19, 2014 - link

    Why did they choose to retire the 650 Ti Boost and replace it with the 750 Ti? The 650 Ti B is a much better card for high end games because of its memory interface. They should have marketed the 750 Ti as the 750, and the 750 as the 740.

    And why on earth did they not include full support for HEVC and DX11.2? You're limiting the industry's adoption for years to come because of your move. I hope they will fix this in the next generation 800 cards, or when they transition to 20nm.
  • Ryan Smith - Thursday, February 20, 2014 - link

    Not speaking for NV here, but keep in mind that 650 Ti Boost is a cut-down GK106 chip. All things considered, 750 Ti will be significantly cheaper to produce for similar performance.

    NVIDIA really only needed it to counter Bonaire, and now that they have GM107 that's no longer the case.
  • FXi - Wednesday, February 19, 2014 - link

    No DX 11.2 or even 11.1 support? For THAT price??
    Pass...
  • rish95 - Wednesday, February 19, 2014 - link

    According to GeForce.com it supports 11.2. Not sure what's up with this:

    http://www.geforce.com/hardware/desktop-gpus/gefor...
  • willis936 - Wednesday, February 19, 2014 - link

    You don't need to be compliant to support something. Compliance means you meet all required criteria. Support means you can run it without having necessarily all the bells and whistles. If console hardware has DX compliance then the devs will take advantage of that and when they're ported you'll lose some of the neat graphics tricks. They might still be able to be done in software, you'll just need a bigger GPU to get the same frame rates :p Some things might not be able to be done in software though. Idk enough about DX to say.
  • sourav - Wednesday, February 19, 2014 - link

    Will it work in a PCIe v2 slot?
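As an aside, the perf-per-watt claim in the mining discussion above is easy to sanity-check with quick arithmetic. The hash rates below are the figures quoted in the comments, and the power numbers are each card's official TDP standing in for actual mining draw (an assumption; real draw will differ):

```python
# Scrypt mining efficiency check using the hash rates quoted above.
# Power figures are official board TDPs (250W GTX 780 Ti, 55W GTX 750)
# and only approximate real mining power draw.
cards = {
    "GTX 780 Ti": (700, 250),  # (KHash/s, watts)
    "GTX 750":    (255, 55),   # midpoint of the quoted 250-260 KHash/s
}

for name, (khash, watts) in cards.items():
    print(f"{name}: {khash / watts:.2f} KHash/s per watt")
# GTX 780 Ti: 2.80 KHash/s per watt
# GTX 750: 4.64 KHash/s per watt
```

On these (assumed) numbers the smaller Maxwell card does come out well ahead on efficiency, even while trailing badly on absolute hashrate.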
