The NVIDIA GeForce GTX 1660 Ti Review, Feat. EVGA XC GAMING: Turing Sheds RTX for the Mainstream Market
by Ryan Smith & Nate Oh on February 22, 2019 9:00 AM EST

Power, Temperature, and Noise
As always, we'll take a look at power, temperature, and noise of the GTX 1660 Ti, though as a pure custom launch we aren't expecting anything out of the ordinary. As mentioned earlier, the XC Black board has already revealed itself in its RTX 2060 guise.
As this is a new GPU, we will quickly review the GeForce GTX 1660 Ti's stock voltages and clockspeeds as well.
NVIDIA GeForce Video Card Voltages

| Model | Boost | Idle |
|---|---|---|
| GeForce GTX 1660 Ti | 1.037v | 0.656v |
| GeForce RTX 2060 | 1.025v | 0.725v |
| GeForce GTX 1060 6GB | 1.043v | 0.625v |
The voltages are naturally similar to those of the 16nm GTX 1060, and compared to pre-FinFET generations they are notably lower, a consequence of the FinFET process that we went over in detail in our GTX 1080 and 1070 Founders Edition review. As we said then, FinFET nodes require these low voltages relative to the older planar nodes, which can be limiting in scenarios where a lot of power and voltage are needed, i.e. high clockspeeds and overclocking. For Turing (along with Volta, Xavier, and NVSwitch), NVIDIA moved from 16nm to 12nm "FFN", with voltage capped at 1.063v.
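To put that ceiling in perspective, dynamic switching power scales roughly with frequency times voltage squared, so the narrow gap between the stock boost voltage and the 1.063v cap leaves little room to buy extra clockspeed with extra voltage. The back-of-the-envelope sketch below is just that, a sketch: it assumes the rough P ∝ f·V² relation and a purely hypothetical 2000MHz overclock target, not anything we measured.

```python
# Rough illustration only: dynamic power scales approximately with f * V^2.
# The voltages are the stock boost voltage from our table and NVIDIA's 1.063v
# Turing cap; the 2000MHz overclock target is hypothetical.

stock_voltage = 1.037   # GTX 1660 Ti boost voltage (volts)
max_voltage = 1.063     # Turing voltage cap (volts)
stock_clock = 1770      # official boost clock (MHz)
target_clock = 2000     # hypothetical overclock (MHz)

voltage_headroom = (max_voltage / stock_voltage - 1) * 100
relative_power = (target_clock / stock_clock) * (max_voltage / stock_voltage) ** 2

print(f"Voltage headroom over stock boost: {voltage_headroom:.1f}%")
print(f"Estimated dynamic power at {target_clock}MHz / {max_voltage}v: "
      f"{relative_power:.2f}x stock")
```

With only about 2.5% of voltage headroom, even this modest hypothetical overclock comes out to nearly 20% more dynamic power, which is why the cap matters for anyone chasing high clockspeeds.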
GeForce Video Card Average Clockspeeds

| Game | GTX 1660 Ti | EVGA GTX 1660 Ti XC | RTX 2060 | GTX 1060 6GB |
|---|---|---|---|---|
| Max Boost Clock | 2160MHz | 2160MHz | 2160MHz | 1898MHz |
| Boost Clock | 1770MHz | 1770MHz | 1680MHz | 1708MHz |
| Battlefield 1 | 1888MHz | 1901MHz | 1877MHz | 1855MHz |
| Far Cry 5 | 1903MHz | 1912MHz | 1878MHz | 1855MHz |
| Ashes: Escalation | 1871MHz | 1880MHz | 1848MHz | 1837MHz |
| Wolfenstein II | 1825MHz | 1861MHz | 1796MHz | 1835MHz |
| Final Fantasy XV | 1855MHz | 1882MHz | 1843MHz | 1850MHz |
| GTA V | 1901MHz | 1903MHz | 1898MHz | 1872MHz |
| Shadow of War | 1860MHz | 1880MHz | 1832MHz | 1861MHz |
| F1 2018 | 1877MHz | 1884MHz | 1866MHz | 1865MHz |
| Total War: Warhammer II | 1908MHz | 1911MHz | 1879MHz | 1875MHz |
| FurMark | 1594MHz | 1655MHz | 1565MHz | 1626MHz |
Looking at clockspeeds, a few things are clear. The most obvious is that the very similar benchmark results of the reference-clocked GTX 1660 Ti and the EVGA GTX 1660 Ti XC are reflected in their virtually identical clockspeeds. The GeForce cards also boost higher than their advertised boost clocks, as is typically the case in our testing. All told, NVIDIA's formal estimates still run a bit low, especially in our properly ventilated testing chassis, so we won't complain about the extra performance.
But on that note, it's interesting to see that while the GTX 1660 Ti should have a roughly 60MHz average boost advantage over the GTX 1060 6GB going by the official specs, in practice the two cards end up within half that span of each other, which hints that NVIDIA's official average boost clock is grounded a bit more accurately here than it was with the GTX 1060.
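For anyone wanting to double-check that math, the short Python sketch below averages the per-game clocks from the table above (FurMark excluded, since it's a deliberately power-limited worst case) and compares the observed gap against the 62MHz gap in the official boost clocks.

```python
# Per-game average clocks (MHz) copied from the table above; FurMark excluded.
gtx_1660_ti = [1888, 1903, 1871, 1825, 1855, 1901, 1860, 1877, 1908]
gtx_1060    = [1855, 1855, 1837, 1835, 1850, 1872, 1861, 1865, 1875]

official_gap = 1770 - 1708  # official boost clocks: GTX 1660 Ti vs GTX 1060 6GB

avg_1660_ti = sum(gtx_1660_ti) / len(gtx_1660_ti)
avg_1060 = sum(gtx_1060) / len(gtx_1060)
observed_gap = avg_1660_ti - avg_1060

print(f"Official boost clock gap: {official_gap} MHz")
print(f"Observed average gap:     {observed_gap:.0f} MHz")
```

The observed gap works out to roughly 20MHz, comfortably within half of the official 62MHz spread.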
Power Consumption
Even though NVIDIA's prices for their xx60 cards have drifted up over the years, the same cannot be said for their power consumption. NVIDIA has set the reference spec for the card at 120W, and relative to their other cards this is exactly what we see. Looking at FurMark, our favorite pathological workload that's guaranteed to bring a video card to its maximum TDP, the GTX 960, GTX 1060, and GTX 1660 Ti are all within 4 Watts of each other, exactly what we'd expect to see from a trio of 120W cards. It's only in Battlefield 1 that these cards pull apart in terms of total system load, and this is due to the greater CPU workload from the higher framerates afforded by the GTX 1660 Ti, rather than a difference at the card level itself.
Meanwhile when it comes to idle power consumption, the GTX 1660 Ti falls in line with everything else at 83W. With contemporary desktop cards, idle power has reached the point where nothing short of low-level testing can expose what these cards are drawing.
As for the EVGA card in its natural state, we see it draw right around 10W more. I'm actually a bit surprised to see this under Battlefield 1 as well, since the framerate difference between it and the reference-clocked card is barely 1%; but as higher clockspeeds get increasingly expensive in terms of power consumption, it's not far-fetched to see a small power difference translate into an even smaller performance difference.
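To put some rough numbers on that, the sketch below compares the EVGA card's extra draw against its framerate gain. Attributing the full ~10W delta to the card's 120W TDP is our simplification, since we measure power at the wall rather than at the card.

```python
# A crude comparison of the EVGA XC's extra power draw against its
# performance gain. The ~10W delta and ~1% framerate gain come from our
# testing; charging the whole delta to the 120W TDP is a simplification.
reference_tdp = 120      # W, GTX 1660 Ti reference spec
extra_power = 10         # W, approximate additional draw of the EVGA XC
framerate_gain = 1.0     # %, Battlefield 1 advantage over reference clocks

power_increase_pct = extra_power / reference_tdp * 100
print(f"Extra power draw: ~{power_increase_pct:.0f}% of reference TDP")
print(f"Framerate gain:   ~{framerate_gain:.0f}%")
```

In other words, the last handful of MHz cost roughly 8% more power for about 1% more performance, which is exactly the kind of diminishing return you'd expect at the top of the boost curve.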
All told, NVIDIA has very good and very consistent power control here; it remains one of their key advantages over AMD, and one of their key strengths in keeping OEM customers happy.
Temperature
Looking at temperatures, there are no big surprises here. EVGA seems to have tuned their card for high performance cooling, and as a result the large, 2.75-slot card reports some of the lowest numbers in our charts, including 67C under FurMark when the card is capped at the reference-spec GTX 1660 Ti's 120W limit.
Noise
Turning again to EVGA's card, despite being a custom open-air design, the GTX 1660 Ti XC Black doesn't come with 0dB idle capabilities, and it features a single, smaller, higher-RPM fan. The default fan curve puts the minimum speed at 33%, which indicates that EVGA has tuned the card for cooling over acoustics. That's not an unreasonable tradeoff to make, but it's something I'd consider more appropriate for a factory-overclocked card. For their reference-clocked XC card, EVGA could very well have gone with a less aggressive fan curve and still easily maintained sub-80C temperatures while reducing noise levels as well.
Comments
Psycho_McCrazy - Tuesday, February 26, 2019 - link
Given that 21:9 monitors are also making great inroads into the gamer's purchase lists, can benchmark resolutions also include 2560.1080p, 3440.1440p and (my wishlist) 3840.1600p benchies??

eddman - Tuesday, February 26, 2019 - link
2560x1080, 3440x1440 and 3840x1600

That's how you write it, and the "p" should not be used when stating the full resolution, since it's only supposed to be used for denoting video format resolutions.
P.S. using 1080p, etc. for display resolutions isn't technically correct either, but it's too late for that.
Ginpo236 - Tuesday, February 26, 2019 - link
a 3-slot ITX-sized graphics card. What ITX case can support this? 0.

bajs11 - Tuesday, February 26, 2019 - link
Why can't they just make a GTX 2080 Ti with the same performance as the RTX 2080 Ti but without the useless RT and DLSS, and charge something like 899 USD (still 100 bucks more than the GTX 1080 Ti)?

I bet it would sell like hotcakes, or at least better than their overpriced RTX 2080 Ti.
peevee - Tuesday, February 26, 2019 - link
Do I understand correctly that this thing does not have PCIe4?

CiccioB - Thursday, February 28, 2019 - link
No, they don't have a PCIe4 bus.

Do you think they should?
Questor - Wednesday, February 27, 2019 - link
Why do I feel like this was a panic plan, an attempt to bandage the bleeding from the RTX failure? No support at launch, and months later still abysmal support for a non-game-changing and insanely expensive technology.

I am not falling for it.
CiccioB - Thursday, February 28, 2019 - link
Yes, a "panic plan" that required about 3 years to create the chips.3 years ago they already know that they would have panicked at the RTX cards launch and so they made the RT-less chip as well. They didn't know that the RT could not be supported in performance with the low number of CUDA core low level cards have.
They didn't know that the concurrent would have played with the only weapon it was left to it to battle, that is prize as they could not think that the concurrent was not ready with a beefed up architecture capable of the sa functionalities.
So, yes, they panicked for sure. They were not prepared to anything of what is happening,
Korguz - Friday, March 1, 2019 - link
" that required about 3 years to create the chips.3 years ago they already know that they would have panicked at the RTX cards launch and so they made the RT-less chip as well. They didn't know that the RT could not be supported in performance with the low number of CUDA core low level cards have. "
and where did you read this ? you do understand, and realize... is IS possible to either disable, or remove parts of an IC with out having to spend " about 3 years " to create the product, right ? intel does it with their IGP in their cpus, amd did it back in the Phenom days with chips like the Phenom X4 and X3....
CiccioB - Tuesday, March 5, 2019 - link
So they created TU116, a completely new die without RT and tensor cores, to reduce the die size and lose about 15% of performance relative to the 2060, all in 3 months, because they panicked?

You probably have no idea how much effort it takes to create a new 280mm^2 die. Well, judging by this and your previous posts, you have no idea what you are talking about at all.