Power, Temperatures, & Noise

Last, but not least of course, is our look at power, temperatures, and noise levels. While a high performing card is good in its own right, an excellent card can deliver great performance while also keeping power consumption and the resulting noise levels in check.

NVIDIA GeForce Video Card Voltages
Model                          Boost    Idle
EVGA GTX 1660 Super SC Ultra   1.05v    0.618v
GeForce GTX 1660               1.043v   0.656v
GeForce GTX 1660 Ti            1.005v   0.65v

Using the same TU116 GPU as the GTX 1660 Ti and the GTX 1660 (vanilla), the voltages are unsurprisingly similar. 1.05v is essentially a universal limit for NVIDIA's Turing GPUs at stock, while the idle voltage of 0.618v is a bit lower than what we’ve seen on other TU116 cards thus far.

GeForce Video Card Average Clockspeeds
Game                        GTX 1660 Super   EVGA GTX 1660    GTX 1660 Ti   GTX 1660
                            (Ref Clocks)     Super SC Ultra
Max Boost Clock             1935MHz          1980MHz          1950MHz       1935MHz
Boost Clock                 1785MHz          1830MHz          1680MHz       1785MHz
Shadow of the Tomb Raider   1860MHz          1905MHz          1875MHz       1875MHz
F1 2019                     1860MHz          1905MHz          1890MHz       1875MHz
Assassin's Creed: Odyssey   1875MHz          1920MHz          1905MHz       1890MHz
Metro: Exodus               1860MHz          1905MHz          1890MHz       1875MHz
Strange Brigade             1860MHz          1905MHz          1890MHz       1890MHz
Total War: Three Kingdoms   1860MHz          1890MHz          1890MHz       1875MHz
The Division 2              1845MHz          1875MHz          1875MHz       1860MHz
Grand Theft Auto V          1875MHz          1920MHz          1905MHz       1890MHz
Forza Horizon 4             1875MHz          1905MHz          1905MHz       1890MHz

The situation with clockspeeds is also very similar, though not entirely a carbon copy of the GTX 1660 (vanilla). Even with NVIDIA’s slightly higher TDP, our GTX 1660 Super card sees a very slight drop in clockspeeds, typically coming in one bin (15MHz, or under 1%) below the original card. In this case it’s a tradeoff we’re glad to take, since as we’ve just seen, the extra memory bandwidth on the GTX 1660 Super more than makes up for any clockspeed deficit, launching the Super card well ahead of its GDDR5-based predecessor.

In any case, the GTX 1660 Super once again comes in well ahead of NVIDIA’s official boost clock specifications. Even under The Division 2, average clockspeeds beat the spec by 60MHz, and in other games it’s more frequently 75 to 90MHz above.
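For readers who want to replicate this kind of bookkeeping on their own card, the arithmetic is simple enough to script. Below is a minimal sketch (not our actual logging tooling, and the sample values are hypothetical) that averages per-second SM clock readings from a benchmark run, snaps the result to Turing's 15MHz bins, and reports the delta against the official boost clock.

```python
from statistics import mean

BIN_MHZ = 15  # Turing steps its clocks in 15MHz bins


def summarize_run(samples_mhz, official_boost_mhz):
    """Average logged SM clocks and compare against the official boost spec.

    samples_mhz: per-second clockspeed samples captured during a benchmark run.
    """
    avg = mean(samples_mhz)
    # Snap the average to the nearest 15MHz bin, matching how Turing adjusts clocks
    binned = round(avg / BIN_MHZ) * BIN_MHZ
    delta = binned - official_boost_mhz
    return {
        "average_mhz": round(avg, 1),
        "binned_mhz": binned,
        "delta_vs_spec_mhz": delta,
        "delta_bins": delta // BIN_MHZ,
    }


# Hypothetical samples from a GTX 1660 Super run (official boost clock: 1785MHz)
samples = [1860, 1875, 1860, 1845, 1860, 1875, 1860]
print(summarize_run(samples, official_boost_mhz=1785))
# -> {'average_mhz': 1862.1, 'binned_mhz': 1860, 'delta_vs_spec_mhz': 75, 'delta_bins': 5}
```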

Idle Power Consumption

Load Power Consumption - Shadow of the Tomb Raider

Load Power Consumption - FurMark

Shifting to power consumption, our results are in line with NVIDIA’s specifications, as well as what we’d expect for yet another TU116 card. With its 125W TGP, the GTX 1660 Super draws ever so slightly more power than either the GTX 1660 Ti or the GTX 1660, particularly in Tomb Raider where the CPU gets a bit more of a workout as well. But on the whole, it’s right in the ballpark with other 120W(ish) NVIDIA cards, with power consumption at the wall for the entire testbed not exceeding 200W.

For the midrange segment, the GTX 1660 Super (and the GTX 1660 Ti) are the cards to beat when it comes to power consumption and efficiency. Everything else at this power level performs much slower, and anything meaningfully faster requires more power, though the faster cards aren't too far off in efficiency terms, as the RX 5700 can attest.
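For anyone who wants to sanity-check their own card's behavior, NVIDIA's nvidia-smi utility can report the board's power, temperature, clock, and fan telemetry while a game or FurMark runs in the background. The sketch below is a minimal polling logger, assuming nvidia-smi is on the PATH and a single GPU at index 0; note that this is board power as the GPU reports it, not total system power at the wall like the figures above.

```python
import csv
import subprocess
import time

QUERY_FIELDS = "power.draw,temperature.gpu,clocks.sm,fan.speed"


def sample_gpu():
    """Poll nvidia-smi once and return (power_w, temp_c, sm_clock_mhz, fan_pct)."""
    out = subprocess.run(
        ["nvidia-smi", "-i", "0",                 # first GPU only
         f"--query-gpu={QUERY_FIELDS}",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    power, temp, clock, fan = (float(x) for x in out.split(", "))
    return power, temp, clock, fan


def log_run(path="gpu_log.csv", seconds=120, interval=1.0):
    """Log one sample per interval while a game or FurMark runs in the background."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["power_w", "temp_c", "sm_clock_mhz", "fan_pct"])
        for _ in range(int(seconds / interval)):
            writer.writerow(sample_gpu())
            time.sleep(interval)


if __name__ == "__main__":
    log_run()
```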

Idle GPU Temperature

Load GPU Temperature - Shadow of the Tomb Raider

Load GPU Temperature - FurMark

As for temperatures, EVGA has delivered one cool-running card. Even at its full, factory overclocked speeds, the EVGA GTX 1660 Super SC Ultra never cracks 70C, and under FurMark’s pathological workload it’s the second-coolest card on these charts.

Idle Noise Levels

Load Noise Levels - Shadow of the Tomb Raider

Load Noise Levels - FurMark

The tradeoff for those temperatures, however, is noise. The EVGA card that delivers chart-topping temperatures also delivers some of the worst noise results among this collection of cards.

The culprit here would seem to be EVGA’s decision to bias the card towards cooling performance rather than acoustics, which, given how far the card is from its 83C thermal throttle point, seems overdone. EVGA could easily back off on the fan speed a bit, let the temperatures drift up to the low 70s, and deliver essentially the same gaming performance (perhaps losing 1 bin in the process) while generating a lot less noise. We have a number of 120W open air cards on these charts, including the GTX 1060 3GB and GTX 960, both of which move just as much heat with much less noise, so it can be done. And, to be fair to EVGA here, their SC Ultra card is by no means a tornado, barely hitting 50 dB(A) in these intensive, open case tests, but the best cards strike a proper balance between noise and performance, maximizing the latter while minimizing the former.
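If you want to experiment with that tradeoff yourself, the fan target can be overridden manually. The sketch below is a hedged, Linux/X11-only example using nvidia-settings; it assumes the driver's Coolbits option is enabled in xorg.conf, and the attribute names are generic NVIDIA driver controls, not anything EVGA-specific. On Windows, tools such as EVGA's own Precision X1 expose the same knob through a GUI.

```python
import subprocess


def set_fan_target(percent, gpu=0, fan=0):
    """Override the GPU fan target via nvidia-settings (Linux/X11 only).

    Assumes the 'Coolbits' option is enabled for the NVIDIA X driver; without it
    the driver rejects manual fan control. Restore automatic control when done.
    """
    subprocess.run(
        ["nvidia-settings",
         "-a", f"[gpu:{gpu}]/GPUFanControlState=1",        # take manual control
         "-a", f"[fan:{fan}]/GPUTargetFanSpeed={int(percent)}"],
        check=True,
    )


def restore_auto_fan(gpu=0):
    """Hand fan control back to the driver's automatic fan curve."""
    subprocess.run(
        ["nvidia-settings", "-a", f"[gpu:{gpu}]/GPUFanControlState=0"],
        check=True,
    )


# Example: back the fan off, run a game while logging temperatures, then restore
# set_fan_target(55)
# ...
# restore_auto_fan()
```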

Ultimately, I suspect part of the engineering challenge EVGA is dealing with here is that the SC Ultra cooler is their smallest GTX 1660 cooler. The triple-slot XC cards (represented here with the GTX 1660 Ti and GTX 1660) have just one fan and much bigger heatsinks to work with. Similarly, EVGA also sells longer dual-slot cards (also called XC) which get the benefit of a longer heatsink. The physics of more heatsink mass (and more/bigger fins) can’t be ignored, which is why smaller cards often need to run faster fans. Still, even if the SC Ultra cooler isn’t particularly big, I do think there’s room for a better fan balance here.

Tangentially, as I mentioned in the EVGA SC Ultra overview, this is actually our second card. The original was even hotter and louder; it reached 75C and 54.6 dB(A) under Tomb Raider in that test. Considering that these GTX 1660 Super cards are operating near or at their power limits and are TDP-capped by the VRMs and monitoring hardware – and thus, one card can’t draw significantly more power than another identical card – it points to a cooling problem with the card itself. EVGA has since taken the card back to figure out what’s going on, but I suspect what they’ll find is poor thermal transfer between the GPU and heatsink, perhaps due to a bad TIM application or a problem with the heatpipes. Ultimately it’s rare that we get dud video cards, but it does happen now and then.

Comments

  • eastcoast_pete - Wednesday, October 30, 2019 - link

    Thanks Ryan, and sorry, the 1660 was already "all Turing", so my question was redundant. I meant to ask about the 1650 Super. If that GPU remains unchanged, it still is a cut Turing GPU with Volta NVENC.
  • timecop1818 - Wednesday, October 30, 2019 - link

    Actually 1660 (not super) already has the Turing NVDEC/NVENC, because it's the first card which can handle 8K60P decode with ~70% NVDEC utilization. On 1080/1080Ti (Pascal) this runs at around 40fps and 100% utilization.

    Reference: https://developer.nvidia.com/video-encode-decode-g...
  • timecop1818 - Wednesday, October 30, 2019 - link

    I'm surprised nobody said "Fuck DVI" yet.
    At least about 1/3 of the AIB makers finally dumped that retardo connector.
    I bought a gigabyte? or something GTX1660 and it was finally a proper card with 3x DP and 1x HDMI.
  • Korguz - Wednesday, October 30, 2019 - link

    considering monitors are STILL made with vga... that's what they should stop making before they drop dvi...
  • Gastec - Wednesday, October 30, 2019 - link

    And monitors could still be made with VGA connectors for 50 more years to come, and the U.S. Military just dumped the floppy disk from their nuclear missile controls.
    From my own experience, both at work and in private life, this lack of knowledge and desire to upgrade is not an exception but the rule. I have a friend that just turned 30; he knows every social networking trick and setting for his smartphone, but connects his laptop to a small monitor via the bundled VGA cable that came in the box. He didn't even know the monitor had a DVI port, or what that even is.
  • MaikelSZ - Wednesday, October 30, 2019 - link

    My monitor has VGA, DVI and HDMI. To this day, all the HDMI connections that I have used on PCs and TVs have given me problems of some kind.
    An interesting problem that I have seen 3 different times in 3 different places was that with the cable plugged in one way it gives image problems (small distortions), or the TV even loses the image for a couple of seconds every so often. If the cable was reversed, the problem disappeared.

    My graphics card has 3 DP and 1 HDMI, and I use a DP-DVI converter for my monitor; I don't use the HDMI. I only use HDMI when I connect 2 monitors, one over HDMI and the other using DVI.
  • grazapin - Wednesday, November 6, 2019 - link

    That sounds like a bad HDMI cable. More specifically, one end of the cable is bad and has intermittent connection problems. When you plug the bad end into the laptop it will flake out, because the cable is more likely to be bumped or jiggled, and it is likely bending to the side and pulling on the connector. When you plug the bad end into the monitor, the cable is more likely to be straight, so it's not putting stress on the connection and doesn't get jostled after you plug it in, while the good end is out where the jostling occurs. Replace that HDMI cable and I bet your problems go away.
  • Nirman04 - Wednesday, October 30, 2019 - link

    It will be interesting to see the effect this has on the market. If a 1660 Super is only $10 more than the "vanilla" 1660 and yet performs closer to the Ti card, which is $60 more than the 1660, I can't see anyone buying the 1660 now, never mind the Ti. Clearly a lot will come down to the pricing from individual manufacturers, who could now cut the price of the 1660 and 1650, but it looks like there is now competition even between Nvidia cards, let alone competition with AMD.
  • Larry Litmanen - Friday, November 1, 2019 - link

    To me it is wait and see, what if Stadia works.

    Why spend money on something that will not run any games in 2 years.

    My gtx 960 can run games only on lowest settings on a Dell U3415w; it basically stopped running new games around 2017.

    The card works, it's just not powerful enough. Frankly, I have no desire to spend $220 on something that will be useless in 2 years.
  • dromoxen - Friday, November 8, 2019 - link

    Too-fast development of new gfx cards renders the older cards redundant too soon. I have a gtx960 with 4gb for futureproofing *sigh*. This HAS to stop.
