Power, Temperature, and Noise

As always, we'll take a look at the power, temperature, and noise characteristics of the GTX 1660 Ti, though as a pure custom launch we aren't expecting anything out of the ordinary. As mentioned earlier, the XC Black board has already revealed itself in its RTX 2060 guise.

As this is a new GPU, we will quickly review the GeForce GTX 1660 Ti's stock voltages and clockspeeds as well.

NVIDIA GeForce Video Card Voltages
Model                   Boost    Idle
GeForce GTX 1660 Ti     1.037v   0.656v
GeForce RTX 2060        1.025v   0.725v
GeForce GTX 1060 6GB    1.043v   0.625v

The voltages are naturally similar to those of the 16nm GTX 1060, and compared to pre-FinFET generations they are considerably lower, a consequence of the FinFET process used, something we went over in detail in our GTX 1080 and 1070 Founders Edition review. As we said then, the 16nm FinFET process requires these low voltages, as opposed to previous planar nodes; this can be limiting in scenarios where a lot of power and voltage are needed, i.e. high clockspeeds and overclocking. For Turing (along with Volta, Xavier, and NVSwitch), NVIDIA moved from 16nm to its 12nm "FFN" process, which caps the voltage at 1.063v.

GeForce Video Card Average Clockspeeds
Game                     GTX 1660 Ti  EVGA GTX 1660 Ti XC  RTX 2060  GTX 1060 6GB
Max Boost Clock          2160MHz      2160MHz              2160MHz   1898MHz
Boost Clock              1770MHz      1770MHz              1680MHz   1708MHz
Battlefield 1            1888MHz      1901MHz              1877MHz   1855MHz
Far Cry 5                1903MHz      1912MHz              1878MHz   1855MHz
Ashes: Escalation        1871MHz      1880MHz              1848MHz   1837MHz
Wolfenstein II           1825MHz      1861MHz              1796MHz   1835MHz
Final Fantasy XV         1855MHz      1882MHz              1843MHz   1850MHz
GTA V                    1901MHz      1903MHz              1898MHz   1872MHz
Shadow of War            1860MHz      1880MHz              1832MHz   1861MHz
F1 2018                  1877MHz      1884MHz              1866MHz   1865MHz
Total War: Warhammer II  1908MHz      1911MHz              1879MHz   1875MHz
FurMark                  1594MHz      1655MHz              1565MHz   1626MHz

Looking at clockspeeds, a few things are clear. The obvious point is that the very similar results of the reference-clocked GTX 1660 Ti and the EVGA GTX 1660 Ti XC are reflected in their virtually identical clockspeeds. The GeForce cards also boost higher than their advertised boost clocks, as is typically the case in our testing. All told, NVIDIA's formal estimates still run a bit low, especially in our properly ventilated testing chassis, so we won't complain about the extra performance.

But on that note, it's interesting to see that while the GTX 1660 Ti should have a roughly 60MHz average boost advantage over the GTX 1060 6GB going by the official specs, in practice the cards end up within half that span of each other. This hints that NVIDIA's official average boost clock is a bit more accurately grounded here than it was with the GTX 1060.
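As an aside, for readers who want to collect this kind of average clockspeed data themselves, it's straightforward to do with NVIDIA's NVML library. Below is a minimal sketch using the pynvml Python bindings; the 60-second window and half-second interval are arbitrary choices for illustration, not our actual test harness.

```python
# Minimal sketch: sample the GPU core clock during a benchmark pass and
# report the average, similar in spirit to the table above. Assumes an
# NVIDIA driver plus the pynvml bindings (pip install nvidia-ml-py).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

samples = []
end = time.time() + 60        # sample for 60 seconds (arbitrary window)
while time.time() < end:
    # Current graphics (core) clock, reported in MHz
    samples.append(pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS))
    time.sleep(0.5)           # two samples per second

print(f"Average core clock: {sum(samples) / len(samples):.0f} MHz")
pynvml.nvmlShutdown()
```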

Power Consumption

Idle Power Consumption

Load Power Consumption - Battlefield 1

Load Power Consumption - FurMark

Even though NVIDIA's prices for their xx60 cards have drifted up over the years, the same cannot be said for their power consumption. NVIDIA has set the reference specs for the card at 120W, and relative to their other cards this is exactly what we see. Looking at FurMark, our favorite pathological workload that's guaranteed to bring a video card to its maximum TDP, the GTX 960, GTX 1060, and GTX 1660 Ti are all within 4 Watts of each other, exactly what we'd expect to see from the trio of 120W cards. It's only in Battlefield 1 that these cards pull apart in terms of total system load, and this is due to the greater CPU workload from the higher framerates afforded by the GTX 1660 Ti, rather than a difference at the card level itself.

Meanwhile, when it comes to idle power consumption, the GTX 1660 Ti falls in line with everything else, with a total system draw of 83W. With contemporary desktop cards, idle power has reached the point where nothing short of low-level testing can expose what these cards are drawing.
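On that note, one such low-level option is the board power sensor that NVIDIA exposes through NVML. A quick sketch along the same lines as above; note that the sensor reports card-level draw in milliwatts, and its accuracy varies from board to board:

```python
# Minimal sketch: poll the card's own board power sensor via NVML, rather
# than measuring total system draw at the wall as in the charts above.
# Same pynvml assumptions as the earlier sketch.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

for _ in range(10):
    power_mw = pynvml.nvmlDeviceGetPowerUsage(handle)          # milliwatts
    limit_mw = pynvml.nvmlDeviceGetEnforcedPowerLimit(handle)  # milliwatts
    print(f"Board power: {power_mw / 1000:.1f} W of {limit_mw / 1000:.0f} W limit")
    time.sleep(1)

pynvml.nvmlShutdown()
```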

As for the EVGA card in its natural state, we see it draw almost exactly 10W more. I'm actually a bit surprised to see this under Battlefield 1 as well, since the framerate difference between it and the reference-clocked card is barely 1%; but as higher clockspeeds get increasingly expensive in terms of power consumption, it's not far-fetched for a small power difference to translate into an even smaller performance difference.

All told, NVIDIA has very good and very consistent power control here. It remains one of their key advantages over AMD, and one of their key strengths in keeping their OEM customers happy.

Temperature

Idle GPU Temperature

Load GPU Temperature - Battlefield 1

Load GPU Temperature - FurMark

Looking at temperatures, there are no big surprises here. EVGA seems to have tuned their card for high-performance cooling, and as a result the large, 2.75-slot card reports some of the lowest numbers in our charts, including 67C under FurMark when the card is capped at the reference-spec GTX 1660 Ti's 120W limit.
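For completeness, GPU temperature is exposed through the same NVML interface, so a load test like the ones charted above can be logged with a few lines. A sketch, with the two-minute window chosen arbitrarily:

```python
# Minimal sketch: track the peak GPU temperature while a load such as
# FurMark runs. Same pynvml assumptions as the earlier sketches.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

peak_c = 0
for _ in range(120):  # two minutes at one-second intervals
    temp_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
    peak_c = max(peak_c, temp_c)
    time.sleep(1)

print(f"Peak GPU temperature: {peak_c} C")
pynvml.nvmlShutdown()
```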

Noise

Idle Noise Levels

Load Noise Levels - Battlefield 1

Load Noise Levels - FurMark

Turning again to EVGA's card, despite being a custom open-air design, the GTX 1660 Ti XC Black doesn't offer 0dB idle capabilities, and it features a single smaller but higher-RPM fan. The default fan curve puts the minimum fan speed at 33%, which indicates that EVGA has tuned the card for cooling over acoustics. That's not an unreasonable tradeoff to make, but it's something I'd consider more appropriate for a factory-overclocked card. For their reference-clocked XC card, EVGA could well have gone with a less aggressive fan curve and still easily maintained sub-80C temperatures while reducing their noise levels as well.
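For anyone curious whether their own card's fan curve has a similar floor, the commanded fan speed (as a percentage of maximum) can also be read via NVML. A last sketch, which simply reports what the fan curve is doing at a given moment:

```python
# Minimal sketch: read the current fan speed (percent of maximum RPM) to
# check a card's idle fan-curve floor, such as the 33% minimum noted above.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

fan_pct = pynvml.nvmlDeviceGetFanSpeed(handle)
temp_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
print(f"Fan speed: {fan_pct}% at {temp_c} C")

pynvml.nvmlShutdown()
```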
