Power, Temperature, and Noise

As always, we'll take a look at power, temperature, and noise of the GTX 1660 Ti, though as a pure custom launch we aren't expecting anything out of the ordinary. As mentioned earlier, the XC Black board has already revealed itself in its RTX 2060 guise.

As this is a new GPU, we will quickly review the GeForce GTX 1660 Ti's stock voltages and clockspeeds as well.

NVIDIA GeForce Video Card Voltages
Model                 Boost   Idle
GeForce GTX 1660 Ti   1.037v  0.656v
GeForce RTX 2060      1.025v  0.725v
GeForce GTX 1060 6GB  1.043v  0.625v

The voltages are naturally similar to those of the 16nm GTX 1060, and compared to pre-FinFET generations, they are notably lower because of the FinFET process used, something we went over in detail in our GTX 1080 and 1070 Founders Edition review. As we said then, the 16nm FinFET process requires these low voltages as opposed to previous planar nodes, which can be limiting in scenarios where a lot of power and voltage are needed, i.e. high clockspeeds and overclocking. For Turing (along with Volta, Xavier, and NVSwitch), NVIDIA moved to 12nm "FFN" rather than 16nm, with the voltage capped at 1.063v.

GeForce Video Card Average Clockspeeds
Game                     GTX 1660 Ti  EVGA GTX 1660 Ti XC  RTX 2060  GTX 1060 6GB
Max Boost Clock          2160MHz      2160MHz              2160MHz   1898MHz
Boost Clock              1770MHz      1770MHz              1680MHz   1708MHz
Battlefield 1            1888MHz      1901MHz              1877MHz   1855MHz
Far Cry 5                1903MHz      1912MHz              1878MHz   1855MHz
Ashes: Escalation        1871MHz      1880MHz              1848MHz   1837MHz
Wolfenstein II           1825MHz      1861MHz              1796MHz   1835MHz
Final Fantasy XV         1855MHz      1882MHz              1843MHz   1850MHz
GTA V                    1901MHz      1903MHz              1898MHz   1872MHz
Shadow of War            1860MHz      1880MHz              1832MHz   1861MHz
F1 2018                  1877MHz      1884MHz              1866MHz   1865MHz
Total War: Warhammer II  1908MHz      1911MHz              1879MHz   1875MHz
FurMark                  1594MHz      1655MHz              1565MHz   1626MHz

Looking at clockspeeds, a few things are clear. The obvious point is that the very similar results of the reference-clocked GTX 1660 Ti and the EVGA GTX 1660 Ti XC are reflected in their virtually identical clockspeeds. The GeForce cards also boost higher than their advertised boost clocks, as is typically the case in our testing. All told, NVIDIA's formal estimates still run a bit low, especially in our properly ventilated testing chassis, so we won't complain about the extra performance.

But on that note, it's interesting to see that while the GTX 1660 Ti should have a roughly 60MHz average boost advantage over the GTX 1060 6GB going by the official specs, in practice the cards end up within half that span, which hints that NVIDIA's official average boost clock is a bit more accurately grounded here than it was with the GTX 1060.
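
To put a rough number on that, here's a quick back-of-the-envelope check using the per-game figures from the table above (FurMark excluded). This is just an illustrative sketch over the published numbers; the simple equal-weight average is our own shorthand rather than a formal methodology.

```python
# Quick sanity check of the observed boost gap, using the per-game average
# clocks from the table above (FurMark excluded, all values in MHz).
gtx_1660_ti  = [1888, 1903, 1871, 1825, 1855, 1901, 1860, 1877, 1908]
gtx_1060_6gb = [1855, 1855, 1837, 1835, 1850, 1872, 1861, 1865, 1875]

avg_1660_ti = sum(gtx_1660_ti) / len(gtx_1660_ti)    # ~1876MHz
avg_1060    = sum(gtx_1060_6gb) / len(gtx_1060_6gb)  # ~1856MHz

spec_gap     = 1770 - 1708           # official boost clock difference: 62MHz
observed_gap = avg_1660_ti - avg_1060

print(f"GTX 1660 Ti average: {avg_1660_ti:.0f}MHz")
print(f"GTX 1060 6GB average: {avg_1060:.0f}MHz")
print(f"Spec gap: {spec_gap}MHz vs. observed gap: {observed_gap:.0f}MHz")
# Observed gap is roughly 20MHz -- under half of the 62MHz implied on paper.
```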

Power Consumption

[Charts: Idle Power Consumption; Load Power Consumption - Battlefield 1; Load Power Consumption - FurMark]

Even though NVIDIA's video card prices for the xx60 cards have drifted up over the years, the same cannot be said for their power consumption. NVIDIA has set the reference spec for the card at 120W, and relative to their other cards this is exactly what we see. Looking at FurMark, our favorite pathological workload that's guaranteed to bring a video card to its maximum TDP, the GTX 960, GTX 1060, and GTX 1660 Ti are all within 4 Watts of each other, exactly what we'd expect to see from a trio of 120W cards. It's only in Battlefield 1 that these cards pull apart in terms of total system load, and this is due to the greater CPU workload from the higher framerates afforded by the GTX 1660 Ti, rather than a difference at the card level itself.

Meanwhile when it comes to idle power consumption, the GTX 1660 Ti falls in line with everything else at 83W. With contemporary desktop cards, idle power has reached the point where nothing short of low-level testing can expose what these cards are drawing.

As for the EVGA card in its natural state, we see it draw almost 10W more. I'm actually a bit surprised to see this under Battlefield 1 as well, since the framerate difference between it and the reference-clocked card is barely 1%, but as higher clockspeeds get increasingly expensive in terms of power consumption, it's not far-fetched for a modest power difference to translate into an even smaller performance difference.
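
To illustrate why that's plausible, recall that dynamic power scales roughly with clockspeed times voltage squared, so the last few percent of frequency tends to cost disproportionately more power. The sketch below uses the Battlefield 1 clocks from the table above, but the small voltage step is purely an assumed value for illustration, not a measured figure for either card.

```python
# Rough illustration of clock-vs-power scaling: dynamic power ~ f * V^2.
# Clocks are the Battlefield 1 averages from the table above; the voltage
# bump on the EVGA card is an assumed example value, not a measurement.
f_ref, v_ref = 1888, 1.000   # reference-clocked GTX 1660 Ti (MHz, relative voltage)
f_oc,  v_oc  = 1901, 1.025   # EVGA GTX 1660 Ti XC (MHz, assumed relative voltage)

perf_gain  = f_oc / f_ref - 1.0                            # at best tracks clockspeed
power_gain = (f_oc * v_oc**2) / (f_ref * v_ref**2) - 1.0   # dynamic power estimate

print(f"Clock/performance gain: ~{perf_gain * 100:.1f}%")   # ~0.7%
print(f"Dynamic power increase: ~{power_gain * 100:.1f}%")  # ~5-6%
# On a ~120W card, that works out to several extra watts for roughly 1% more
# performance -- consistent with a small FPS gap alongside a larger power gap.
```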

All told, NVIDIA has very good and very consistent power control here, and it remains one of their key advantages over AMD, as well as one of their key strengths in keeping their OEM customers happy.

Temperature

[Charts: Idle GPU Temperature; Load GPU Temperature - Battlefield 1; Load GPU Temperature - FurMark]

Looking at temperatures, there are no big surprises here. EVGA seems to have tuned their card for high-performance cooling, and as a result the large, 2.75-slot card reports some of the lowest numbers in our charts, including 67C under FurMark when the card is capped at the reference-spec GTX 1660 Ti's 120W limit.

Noise

[Charts: Idle Noise Levels; Load Noise Levels - Battlefield 1; Load Noise Levels - FurMark]

Turning again to EVGA's card, despite being a custom open-air design, the GTX 1660 Ti XC Black doesn't come with 0dB idle capabilities, and it features a single, smaller, higher-RPM fan. The default fan curve puts the minimum at 33%, which indicates that EVGA has tuned the card for cooling over acoustics. That's not an unreasonable tradeoff to make, but it's something I'd consider more appropriate for a factory-overclocked card. For their reference-clocked XC card, EVGA could very well have gone with a less aggressive fan curve and still easily maintained sub-80C temperatures while reducing noise levels as well.

157 Comments

  • Midwayman - Friday, February 22, 2019 - link

    I feel like they don't realize that until they improve the performance per $$$ there is very little reason to upgrade. I'm happy sitting on an older card until that changes. Though if I were on a lower end card, I might be kicking myself for not just buying a better card years ago.
  • eva02langley - Friday, February 22, 2019 - link

    Since the price brackets moved up so much, with relative performance now sitting at a higher price point than the last generation, there is absolutely no reason to upgrade. It's different if you actually need a GPU.
  • zmatt - Friday, February 22, 2019 - link

    Agreed. It's kind of wild that I have to pay $350 to get on average 10fps better than my 980ti. If I want a real solid performance improvement I have to essentially pay the same today as when the 980ti was brand new. The 2070 is anywhere between $500-$600 right now depending on model and features. IIRC the 980ti was around $650. And according to AnandTech's own benchmarks it gives on average 20fps better performance. That's 2 generations and 5 years, and I get 20fps for $50 less? No. I should have a 100% performance advantage for the same price by this point. Nvidia is milking us. I'm eyeballing it a bit here but the 2080Ti is a little bit over double the performance of a 980Ti. It should cost less than $700 to be a good deal.
  • Samus - Friday, February 22, 2019 - link

    I agree in that this card is a tough sell over an RTX 2060. Most consumers are going to spend the extra $60-$70 for what is a faster, more well-rounded, and future-proof card. If this were $100 cheaper it'd make some sense, but it isn't.
  • PeachNCream - Friday, February 22, 2019 - link

    I'm not so sure about the value prospects of the 2070. The banner feature, real-time ray tracing, is quite slow even on the most powerful Turing cards and doesn't offer much of a graphical improvement for the variety of costs involved (power and price mainly). That positions the 1660 as a potentially good selling graphics card AND endangers the adoption of said ray tracing such that it becomes a less appealing feature for game developers to implement. Why spend the cash on supporting a feature that reduces performance and isn't supported on the widest possible variety of potential game buyers' computers and why support it now when NVIDIA seems to have flinched and released the 1660 in a show of a lack of commitment? Already game studios have ditched SLI now that DX12 pushed support off GPU companies and into price-sensitive game publisher studios. We aren't even seeing the hyped up feature of SLI between a dGPU and iGPU that would have been an easy win on the average gaming laptop due in large part to cost sensitivity and risk aversion at the game studios (along with a healthy dose of "console first, PC second" prioritization FFS).
  • GreenReaper - Friday, February 22, 2019 - link

    What I think you're missing is that the DirectX rendering API set by Microsoft will be implemented by all parties sooner or later. It really *does* meet a need which has been approximated in any number of ways previously. Next generation consoles are likely to have it as a feature, and if so all the AAA games for which it is relevant are likely to use it.

    Having said that, the benefit for this generation is . . . dubious. The first generation always sells at a premium, and having an exclusive even moreso; so unless you need the expanded RAM or other features that the higher-spec cards also provide, it's hard to justify paying it.
  • alfatekpt - Monday, February 25, 2019 - link

    I'm not sure about that. It also means an increase in thermals and power consumption, which costs money over time. The RTX advantage is basically null at that point unless you want to play at low FPS, so the 2060's advantage is 'merely' raw performance.

    For most people and current games the 1660 already offers ultra great performance, so I'm not sure people are gonna shell out even more money for the 2060 since the 1660 is already a tad expensive.

    The 1660 seems to be an awesome combination of performance and efficiency. Would it be better $50 lower? Of course, but why would they do that, since they don't have real competition from AMD...
  • Strunf - Friday, February 22, 2019 - link

    Why would NVIDIA give up on a market that costs them almost nothing? If 5 years from now they do cloud gaming, then they're pretty much still doing GPUs.
    Anyway, even in 5 years cloud gaming will still be a minor part of the GPU market.
  • MadManMark - Friday, February 22, 2019 - link

    "They are pushing prices up and up but that's not a long term strategy."

    That comment completely ignores the massive increase in value over both the RX 590 and Vega 56. Nvidia produces a card that both makes the RX 590 at the same price point completely unjustifiable and prompts AMD to cut the price of the Vega 56 in HALF overnight, and you are saying that it is *Nvidia* not *AMD* that is charging high prices?!?! I've always thought the AMD GPU fanatics who think AMD delivers more value were somewhat delusional, but this comment really takes the cake.
  • eddman - Saturday, February 23, 2019 - link

    It's not about AMD. The launch prices have clearly been increased compared to previous gen nvidia cards.

    Even this card is $30 more than the general $200-250 range.
