Final Words

We’re now four GPUs into the NVIDIA Turing architecture product stack, and while NVIDIA’s latest processor has pitched us a bit of a curve ball in terms of feature support, by and large NVIDIA is holding to a pretty consistent pattern with regards to product performance, positioning, and pricing. Which is to say that the company has a very specific product stack in mind for this generation, and thus far they’ve been delivering on it with the kind of clockwork efficiency that NVIDIA has come to be known for.

With the launch of the GeForce GTX 1660 Ti and the TU116 GPU underpinning it, we’re finally seeing NVIDIA shift gears a bit in how they’re building their cards. Whereas the four RTX 20 series cards are all loosely collected under the umbrella of “premium features for a premium price”, the GTX 1660 Ti goes in the other direction, dropping NVIDIA’s shiny RTX suite of effects for a product that is leaner and cheaper to produce. As a result, the new card offers a bigger improvement on a price/performance basis (in current games) than any of the other Turing cards, and with a sub-$300 price tag, is likely to be more warmly received than the other cards.

Looking at the numbers, the GeForce GTX 1660 Ti delivers around 37% more performance than the GTX 1060 6GB at 1440p, and a very similar 36% gain at 1080p. So, consistent with the other Turing cards, this is not quite a major generational leap in performance; and to be fair to NVIDIA, they aren't really claiming otherwise. Instead, NVIDIA is mostly looking to sell this card to current GTX 960 and R9 380 users – people who skipped the Pascal generation and are still on 28nm parts. For those buyers, the GTX 1660 Ti offers well over 2x the performance of their current cards, with performance frequently ending up neck-and-neck with what was the GTX 1070.

Meanwhile, taking a look at power efficiency, it’s interesting to note that for the GTX 1660 Ti, NVIDIA has been able to hold the line on power consumption: performance has gone up versus the GTX 1060 6GB, but card power consumption hasn’t. Thanks to this, the GTX 1660 Ti is not just 36% faster, it’s 36% more efficient as well. The other Turing cards have seen efficiency gains of their own, but with their TDPs all drifting up, this is the largest (and purest) efficiency gain we’ve seen to date, and probably the best metric thus far for evaluating Turing’s power efficiency against Pascal’s.
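To make the arithmetic explicit, here’s a quick back-of-the-envelope sketch. The 120W figure is the two cards’ shared rated TDP; actual measured board power will of course vary:

```python
# Back-of-the-envelope perf-per-watt comparison. Performance is
# normalized so the old card = 1.00; both cards here share a 120 W
# rated TDP (board power rating, not measured wall power).
def efficiency_gain(perf_gain_pct: float, old_power_w: float, new_power_w: float) -> int:
    """% change in performance-per-watt versus the old card."""
    old_ppw = 1.0 / old_power_w
    new_ppw = (1 + perf_gain_pct / 100) / new_power_w
    return round((new_ppw / old_ppw - 1) * 100)

# GTX 1660 Ti vs GTX 1060 6GB: +36% performance at the same 120 W TDP,
# so the perf-per-watt gain simply matches the performance gain.
print(efficiency_gain(36, 120, 120))  # 36
```

With equal power draw the efficiency gain passes through one-to-one, which is why the 36% figures line up; for the other Turing cards the rising TDPs shrink the denominator’s advantage.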

The end result of these improvements in performance and power efficiency is that NVIDIA has once again put together a very solid Turing-based video card. And while its performance gains don’t make the likes of the GTX 1060 6GB and Radeon RX 590 obsolete overnight, it’s a clear case of out with the old and in with the new for the mainstream video card market. The GTX 1060 is well on its way out, and meanwhile AMD is going to have to significantly reposition the $279 RX 590. The GTX 1660 Ti cleanly beats it in performance and power efficiency, delivering 25% better performance for a bit over half the power consumption.

If anything, having cleared its immediate competitors with superior technology, the only real challenge NVIDIA will face is convincing consumers to pay $279 for an xx60 class card that performs like a $379 card from two years ago. In this respect the GTX 1660 Ti is a much better value proposition than the RTX 2060 above it, but it’s also more expensive than the GTX 1060 6GB it replaces, so it runs the risk of drifting out of the mainstream market entirely. Thankfully, pricing here is a lot more grounded than with the RTX 20 series cards, but the mainstream market is admittedly more price sensitive to begin with.

This also means that AMD remains a wildcard factor; they have the option of playing the value spoiler with cheap RX 590 cards, and I’m curious to see how serious they really are about bringing the RX Vega 56 in to compete with NVIDIA’s newest card. Our testing shows that RX Vega 56 is still around 5% faster on average, so AMD could still play a new version of the RX 590 gambit (fight on performance and price, damn the power consumption).

Perhaps the most surprising part about any of this is that despite the fact that the GTX 1660 Ti very notably omits NVIDIA’s RTX functionality, I’m not convinced RTX alone is going to sway any buyers one way or another. With the RTX 2060 being both a faster and more expensive card, I’ve tabulated the performance and price increases for all of the Turing cards launched thus far.

GeForce: Turing versus Pascal

  Comparison                    List Price (Turing)   Relative Performance   Relative Price   Relative Perf-Per-Dollar
  RTX 2080 Ti vs GTX 1080 Ti    $999                  +32%                   +42%             -7%
  RTX 2080 vs GTX 1080          $699                  +35%                   +40%             -4%
  RTX 2070 vs GTX 1070          $499                  +35%                   +32%             +2%
  RTX 2060 vs GTX 1060 6GB      $349                  +59%                   +40%             +14%
  GTX 1660 Ti vs GTX 1060 6GB   $279                  +36%                   +12%             +21%
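The perf-per-dollar column follows directly from the two columns before it; a minimal sketch of the calculation, using the figures from the table above:

```python
# Derive the relative perf-per-dollar column from the relative
# performance and relative price columns of the table above.
def perf_per_dollar_change(rel_perf_pct: float, rel_price_pct: float) -> int:
    """% change in performance-per-dollar, given % changes in
    performance and list price versus the predecessor card."""
    ratio = (1 + rel_perf_pct / 100) / (1 + rel_price_pct / 100)
    return round((ratio - 1) * 100)

# GTX 1660 Ti vs GTX 1060 6GB: +36% performance for +12% price
print(perf_per_dollar_change(36, 12))   # 21, i.e. +21%
# RTX 2080 Ti vs GTX 1080 Ti: +32% performance for +42% price
print(perf_per_dollar_change(32, 42))   # -7
```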

The long and short of matters is that with the cheapest RTX card costing an additional $80, there’s a much stronger rationale to act based on pricing than feature sets. In fact, considering just how amazingly consistent the performance gains are on a generation-by-generation basis, there’s ample evidence that NVIDIA has always planned it this way. Earlier I mentioned that NVIDIA acts with clockwork efficiency, and with nearly every Turing card improving over its predecessor by roughly 35% (save for the RTX 2060, which has no direct predecessor), it’s remarkable just how consistent NVIDIA’s product positioning is here. If the next GTX 16 series card isn’t also 35% faster than its predecessor, then I’m going to be amazed.

In any case, this makes a potentially complex situation for card buyers pretty simple: buy the card you can afford – or at least, the card with the performance you’re after – and don’t worry about whether it’s RTX or GTX. And while it’s unfortunate that NVIDIA didn’t include their RTX functionality top-to-bottom in the Turing family, there’s also a good argument to be had that the high-performance cost means that it wouldn’t make sense on a mainstream card anyhow. At least, not for this generation.

Last, but not least, we have the matter of EVGA’s GeForce GTX 1660 Ti XC Black GAMING. As this is a launch without reference cards, we’re going to see NVIDIA’s board partners hit the ground running with their custom cards. And in true EVGA tradition, their XC Black GAMING is a solid example of what to expect from a $279 baseline GTX 1660 Ti card.

Since this isn’t a factory overclocked card, I’m a bit surprised that EVGA bothered to ship it with an increased 130W TDP. But I’m also glad they did, as the fact that it only improves performance by around 1% versus the same card at 120W is a very clear indicator that the GTX 1660 Ti is not meaningfully TDP limited. Overclocking will be another matter of course, but at stock this means that NVIDIA hasn’t had to significantly clamp down on power consumption to hit their power targets.

As for EVGA’s card design, I have to admit a triple-slot cooler is an odd choice for a 130W card – a standard double-wide card would have been more than sufficient for that kind of TDP – but in a market that’s going to be full of single and dual fan cards, it definitely stands out from the crowd; and quite literally so, in the case of NVIDIA’s own promotional photos. Meanwhile, I’m not sure there’s much to be said about EVGA’s software that we haven’t said a dozen times before: EVGA Precision remains some of the best overclocking software on the market. And with such a beefy cooler on this card, it’s certainly begging to be overclocked.

  • Midwayman - Friday, February 22, 2019 - link

I feel like they don't realize that until they improve the performance per $$$, there is very little reason to upgrade. I'm happy sitting on an older card until that changes. Though if I were on a lower end card, I might be kicking myself for not just buying a better card years ago.
  • eva02langley - Friday, February 22, 2019 - link

Since price brackets have moved up so much relative to performance compared to the last generation, there is absolutely no reason to upgrade. That's different if you need a GPU.
  • zmatt - Friday, February 22, 2019 - link

Agreed. It's kind of wild that I have to pay $350 to get, on average, 10fps better than my 980 Ti. If I want a real solid performance improvement, I have to essentially pay the same today as when the 980 Ti was brand new. The 2070 is anywhere between $500-$600 right now depending on model and features. IIRC the 980 Ti was around $650. And according to AnandTech's own benchmarks, it gives on average 20fps better performance. That's two generations and five years, and I get 20fps for $50 less? No. I should have a 100% performance advantage for the same price by this point. Nvidia is milking us. I'm eyeballing it a bit here, but the 2080 Ti is a little over double the performance of a 980 Ti. It should cost less than $700 to be a good deal.
  • Samus - Friday, February 22, 2019 - link

I agree in that this card is a tough sell over an RTX 2060. Most consumers are going to spend the extra $60-$70 for what is a faster, better-rounded, and more future-proof card. If this were $100 cheaper it'd make some sense, but it isn't.
  • PeachNCream - Friday, February 22, 2019 - link

I'm not so sure about the value prospects of the 2070. The banner feature, real-time ray tracing, is quite slow even on the most powerful Turing cards and doesn't offer much of a graphical improvement for the variety of costs involved (power and price, mainly). That positions the 1660 Ti as a potentially good-selling graphics card AND endangers the adoption of said ray tracing, such that it becomes a less appealing feature for game developers to implement. Why spend the cash on supporting a feature that reduces performance and isn't supported on the widest possible variety of potential game buyers' computers, and why support it now, when NVIDIA seems to have flinched and released the 1660 Ti in a show of a lack of commitment? Game studios have already ditched SLI now that DX12 has pushed support off GPU companies and onto price-sensitive game publisher studios. We aren't even seeing the hyped-up feature of SLI between a dGPU and iGPU that would have been an easy win on the average gaming laptop, due in large part to cost sensitivity and risk aversion at the game studios (along with a healthy dose of "console first, PC second" prioritization, FFS).
  • GreenReaper - Friday, February 22, 2019 - link

What I think you're missing is that the DirectX raytracing API set by Microsoft will be implemented by all parties sooner or later. It really *does* meet a need which has been approximated in any number of ways previously. Next-generation consoles are likely to have it as a feature, and if so, all the AAA games for which it is relevant are likely to use it.

    Having said that, the benefit for this generation is... dubious. The first generation always sells at a premium, and having an exclusive even more so; so unless you need the expanded RAM or other features that the higher-spec cards also provide, it's hard to justify paying it.
  • alfatekpt - Monday, February 25, 2019 - link

    I'm not sure about that. There is also an increase in thermals and power consumption, which costs money over time. The RTX advantage is basically null at that point, unless you want to play at low FPS, so the 2060's advantage is 'merely' raw performance.

    For most people and current games, the 1660 Ti already offers great performance, so I'm not sure people are going to shell out even more money for the 2060, since the 1660 Ti is already a tad expensive.

    The 1660 Ti seems to be an awesome combination of performance and efficiency. Would it be better $50 lower? Of course, but why? They don't have real competition from AMD...
  • Strunf - Friday, February 22, 2019 - link

    Why would NVIDIA give up on a market that costs them almost nothing? Even if, five years from now, they're doing cloud gaming, they're pretty much still doing GPUs.
    Anyway, even in five years, cloud gaming will still be a minor part of the GPU market.
  • MadManMark - Friday, February 22, 2019 - link

    "They are pushing prices up and up but that's not a long term strategy."

    That comment completely ignores the massive increase in value over both the RX 590 and Vega 56. Nvidia produces a card that both makes the RX 590 at the same price point completely unjustifiable, and prompts AMD to cut the price of the Vega 56 in HALF overnight, and you are saying that it is *Nvidia*, not *AMD*, that is charging high prices?! I've always thought the AMD GPU fanatics who think AMD delivers more value were somewhat delusional, but this comment really takes the cake.
  • eddman - Saturday, February 23, 2019 - link

    It's not about AMD. The launch prices have clearly been increased compared to previous gen nvidia cards.

    Even this card is $30 more than the general $200-250 range.
