Closing Thoughts

As we bring this review to a close, we find ourselves once again revisiting the central themes of the GeForce RTX 20 series launches: forward-looking feature sets that are not yet widely supported, premium pricing based on those hardware features, and competition with existing Pascal products offering comparable conventional gaming performance. This time, however, the last two have played out a little differently. Pascal-based GTX 1080, 1070 Ti, and 1070 cards are no longer so readily available, and where they are, prices have crept upward. And although the price premium pushes the GeForce RTX 2060 (6GB) out of the traditional mainstream home of the x60 part, it puts the card firmly in contention against the Radeon RX Vega cards and, to a lesser extent, the recently-launched Radeon RX 590.

As a whole, in developing Turing and the GeForce RTX 20 series, NVIDIA has invested heavily in hybrid rendering, offering less price-to-performance for conventional gaming than usual for new GPU architectures. This has been compounded by excess Pascal inventory, a result of the cryptocurrency mining demand of the past year or two. The RTX 2060 (6GB) is no exception, and while it is a better price-to-performance offering relative to its older siblings, it’s simply no longer a ‘mainstream’ video card at $350, instead occupying the ‘value-enthusiast’ space.

For conventional gaming, if the RTX 2080 is akin to the GTX 1080 Ti and the RTX 2070 to the GTX 1080, then the RTX 2060 (6GB) truly performs like the GTX 1070 Ti. By the numbers, the RTX 2060 (6GB) is 2-3% faster than the GTX 1070 Ti at 1440p and 1080p, though the comparison becomes a wash at 4K. In turn, reference-to-reference the RTX 2060 (6GB) is around 11% faster than the RX Vega 56 at 1440p/1080p, narrowing to 8% at 4K. There are hints that the 6GB framebuffer might be limiting, especially with unexpectedly low 99th percentile framerates in Wolfenstein II at 4K, though nothing to the extent that older 4GB GTX 900 series cards have experienced.

Potential VRAM bottlenecks are something that needs further investigation, but more to the point, this is a $350 card featuring only 6GB of VRAM. Admittedly it performs 14-15% ahead of the 8GB GTX 1070, a card whose $379 MSRP was relatively close, but it also means that NVIDIA has essentially regressed in VRAM capacity at this price point. In terms of the larger RTX lineup, 6GB is a somewhat more reasonable progression next to the 8GB of the RTX 2070 and RTX 2080, but it is something to revisit if there are indeed lower-memory cut-down variants of the RTX 2060 on the way, or if games continue the historical path of always needing more framebuffer space. The biggest question here isn't whether it will impact the card right now, but whether 6GB will still be enough even a year down the line.

Generationally, the RTX 2060 (6GB) does bring more to the table, offering roughly 86% of the performance of the RTX 2070 for 70% of the price. Or against its direct predecessor, the GTX 1060 6GB, it's faster by around 59%. For context, the GTX 1060 6GB was 80-85% faster than the GTX 960 (2GB) at launch, while today that gap is more along the lines of 2X or more, with the increased framebuffer the primary driver. But at $200, the GTX 960 was a true mainstream card, as was the GTX 1060 6GB at its $249 MSRP, despite the $299 Founders Edition pricing.
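As a rough back-of-the-envelope check on those relative value figures (a sketch using the approximate performance ratios and launch prices cited above, not any new measurements), the arithmetic works out along these lines:

```python
# Back-of-the-envelope price-to-performance comparison.
# Performance figures are the approximate relative numbers cited above;
# prices are launch MSRPs, so treat the output as illustrative only.
cards = {
    "RTX 2070":     {"relative_perf": 1.00,        "price_usd": 500},  # baseline
    "RTX 2060 6GB": {"relative_perf": 0.86,        "price_usd": 350},  # ~86% of the RTX 2070
    "GTX 1060 6GB": {"relative_perf": 0.86 / 1.59, "price_usd": 249},  # the RTX 2060 is ~59% faster
}

for name, card in cards.items():
    perf_per_dollar = card["relative_perf"] / card["price_usd"]
    print(f"{name}: {perf_per_dollar * 100:.3f} relative perf per $100")
```

By that measure the RTX 2060 (6GB) comes out roughly 20-25% ahead of the RTX 2070 and modestly ahead of the GTX 1060 6GB at its original MSRP, which is consistent with the card being a better value within the RTX stack without breaking the historical curve.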

What makes the $350 pricing at least a bit more reasonable is its Radeon competition. Against RX Vega at its current prices the RTX 2060 (6GB) is near-lethal, so if AMD wants to keep their Vega cards as viable market competitors, they are going to have to reduce prices. Reference-to-reference, the RTX 2060 (6GB) is already bringing around 95% of RX Vega 64 performance, so card pricing will make all the difference. The same goes for the RX 590, whose position in the ‘performance gap’ between the RX Vega 56 and RX 580 is now shared. And alongside potential price changes, there are still the value-adds of game bundles and FreeSync compatibility.
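To put the Vega 64 comparison in concrete terms, here is a quick sketch of the break-even pricing implied by that ~95% figure (again working from the numbers above rather than any new data):

```python
# If the RTX 2060 (6GB) delivers roughly 95% of RX Vega 64 performance at $350,
# Vega 64 only matches the RTX 2060's performance-per-dollar at or below this price.
rtx_2060_price = 350
rtx_2060_perf_vs_vega64 = 0.95  # reference-to-reference, per the figures above

vega64_break_even = rtx_2060_price / rtx_2060_perf_vs_vega64
print(f"RX Vega 64 reaches perf/$ parity at roughly ${vega64_break_even:.0f}")  # ~$368
```

Anything meaningfully above that price, and the bundles and FreeSync value-adds have to carry the difference on their own.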

At least, that would have been the straightforward case for AMD if not for yesterday's announcement of game bundles for RTX cards, as well as 'G-Sync Compatibility', under which NVIDIA cards will support VESA Adaptive Sync. That driver is due on the same day as the RTX 2060 (6GB) launch, and it could mean the eventual negation of AMD's FreeSync ecosystem advantage.

Like the RTX 2070, the RTX 2060 (6GB) is less suited as an option for most high-end GTX 10 series owners, and with 6GB of VRAM it's a little less tempting than it could be as a move up from the GTX 1060 6GB or GTX 980 Ti. The card offers known-quantity performance along the lines of the GTX 1070 Ti at very similar power consumption, but brings better value than the existing higher-end RTX 20 series models. And this time, there's less of a spoiler effect from older Pascal models.

Compared to previous generations, it's not breaking the price-to-performance curve, as it is still an RTX card and pulling double-duty as the new entry-point for RTX platform support. That being said, there is no mincing words about the continuing price creep of the past two GeForce series. The price-to-performance characteristics of the RTX 2070, 2080, and 2080 Ti are what render the RTX 2060 (6GB) a better value in comparison, not necessarily because it is a great value in absolute terms. But as an upgrade from older mainstream cards, the RTX 2060 (6GB) price point is a lot more reasonable than the RTX 2070's $500+, where more of the price premium comes from forward-looking hardware-accelerated features like realtime raytracing.

So the RTX 2060 (6GB) would be most suitable for gamers who aren't gung-ho early adopters or longtime enthusiasts. The caveat is the 6GB framebuffer, keeping in mind that the 4GB GTX 980 and 970 now punch below their weight in certain games, given the trends of HDR, HD texture packs, high refresh rates, and more. Beyond that, the RTX 2060 (6GB) and RTX 2070 come with a choice of Anthem or Battlefield V as part of the new bundle. For a prospective buyer, this might not justify $500, but it could tip the scales at $350, especially as realtime raytracing can be tried out immediately with Battlefield V. Likewise, the upcoming support for adaptive sync could do the same for those looking to upgrade to a monitor with variable refresh rate.

Comments
  • B3an - Monday, January 7, 2019 - link

    More overpriced useless shit. These reviews are very rarely harsh enough on this kind of crap either, and I mean tech media in general. This shit isn't close to being acceptable.
  • PeachNCream - Monday, January 7, 2019 - link

    Professionalism doesn't demand harshness. The charts and the pricing are reliable facts that speak for themselves and let a reader reach conclusions about the value proposition or the acceptability of the product as worthy of purchase. Since opinions between readers can differ significantly, it's better to exercise restraint. These GPUs are given out as media samples for free and, if I'm not mistaken, other journalists have been denied pre-NDA-lift samples for blasting the company or the product. With GPU shortages all around and the need to have a day one release in order to get search engine placement that drives traffic, there is incentive to tiptoe around criticism when possible.
  • CiccioB - Monday, January 7, 2019 - link

    It all depends on what your definition of "shit" is.
    Shit may be something that for you costs too much (so shit is Porsche, Lamborghini and Ferrari, but for someone else also Audi, BMW and Mercedes, and for someone else still all C-segment cars), or it may be something that does not work as expected or underperforms relative to the resources it has.
    So for someone else, shit may be a chip that, with 230mm^2, 256GB/s of bandwidth and 240W, performs like a chip that is 200mm^2, has 192GB/s of bandwidth and uses half the power.
    Or it may be a chip that, with 480mm^2, 8GB of the latest HBM technology and more than 250W, performs just a bit better than a 314mm^2 chip with GDDR5X that uses 120W less.

    To each their own definition of "shit" and of what should be bought to incentivize real technological progress.
  • saiga6360 - Tuesday, January 8, 2019 - link

    It's shit when your Porsche slows down when you turn on its fancy new features.
  • Retycint - Tuesday, January 8, 2019 - link

    The new feature doesn't subtract from its normal functions though - there is still an appreciable performance increase despite the focus on RTX and whatnot. Plus, you can simply turn RTX off and use it like a normal GPU? I don't see the issue here
  • saiga6360 - Tuesday, January 8, 2019 - link

    If you feel compelled to turn off the feature, then perhaps it is better to buy the alternative without it at a lower price. It comes down to how much the eye candy is worth to you at performance levels that you can get from a sub $200 card.
  • CiccioB - Tuesday, January 8, 2019 - link

    It's shit when these fancy new features are held back by the console market, which has difficulty handling less than half the polygons that Pascal can, let alone the new Turing GPUs.
    The problem is not the technology that is put at our disposal, but the market that is held back by obsolete "standards".
  • saiga6360 - Tuesday, January 8, 2019 - link

    You mean held back by economics? If Nvidia feels compelled to sell ray tracing in its infancy for thousands of dollars, what do you expect of console makers who are selling the hardware at a loss? Consoles sell games, and if the games are compelling without the massive polygon counts and ray tracing, then the hardware limitations can be justified. Besides, this can hardly be said of modern consoles, which can push some form of 4K gaming at 30fps in AAA games that aren't even sold on PC. Ray tracing is nice to look at, but it hardly justifies the performance penalties at that price point.
  • CiccioB - Wednesday, January 9, 2019 - link

    The same may be said for 4K: fancy to see, but 4x the performance cost vs Full HD is too much.
    But as you can see, there are more and more people looking for 4K benchmarks to decide which card to buy.
    I would take better graphics over resolution any day.
    Raytraced films on Blu-ray (so in Full HD) look far better than any rasterized graphics at 4K.
    The path for graphics quality has been traced. Bear with it.
  • saiga6360 - Wednesday, January 9, 2019 - link

    4K vs ray tracing seems like an obvious choice to you, but people vote with their money, and right now 4K is the far less cost-prohibitive eye-candy choice. One company doing it alone will not solve this, especially at such a cost vs performance ratio. We got to 4K and adaptive sync because they became affordable solutions; they weren't always, but we are here now, and ray tracing is still just a fancy gimmick too expensive for most. Like it or not, it will take AMD and Intel getting on board for ray tracing in hardware across platforms, but before that, a game that truly shows the benefits of ray tracing. Preferably one that doesn't suck.
