Closing Thoughts

As we bring this review to a close, we are once again revisiting the central themes of the GeForce RTX 20 series launches: forward-looking feature sets that are not yet widely supported, premium pricing based on those hardware features, and competition with existing Pascal products due to comparable conventional gaming performance. This time, however, the last two have played out a little differently. Pascal-based GTX 1080, 1070 Ti, and 1070 cards are no longer so readily available, and those that remain are often at higher prices. And although the price premium pushes the GeForce RTX 2060 (6GB) out of the traditional mainstream home of the x60 part, it puts the card firmly in contention with the Radeon RX Vega cards and, to a lesser extent, the recently-launched Radeon RX 590.

As a whole, in developing Turing and the GeForce RTX 20 series, NVIDIA has invested heavily in hybrid rendering, offering a smaller price-to-performance improvement in conventional gaming than is usual for a new GPU architecture. This has been compounded by excess Pascal inventory, a result of the cryptocurrency mining demand of the past year or two. The RTX 2060 (6GB) is no exception, and while it is a better price-to-performance offering than its older siblings, it’s simply no longer a ‘mainstream’ video card at $350, instead occupying the ‘value-enthusiast’ space.

For conventional gaming, if the RTX 2080 is akin to the GTX 1080 Ti and the RTX 2070 to the GTX 1080, then the RTX 2060 (6GB) truly performs like the GTX 1070 Ti. By the numbers, the RTX 2060 (6GB) is 2-3% faster than the GTX 1070 Ti at 1440p and 1080p, though the comparison becomes a wash at 4K. In turn, reference-to-reference the RTX 2060 (6GB) is around 11% faster than the RX Vega 56 at 1440p/1080p, narrowing to 8% at 4K. There are hints that the 6GB framebuffer might be limiting, particularly the unexpectedly low 99th percentile framerates in Wolfenstein II at 4K, though nothing to the extent that the older 4GB GTX 900 series cards have experienced.

Potential VRAM bottlenecks are something that needs further investigation, but more to the point, this is a $350 card featuring only 6GB of VRAM. Admittedly, it performs 14-15% ahead of the 8GB GTX 1070, a card that at its $379 MSRP was relatively close in price, but it also means that NVIDIA has essentially regressed in VRAM capacity at this price point. In terms of the larger RTX lineup, 6GB is a more reasonable progression relative to the 8GB of the RTX 2070 and RTX 2080, but it is something to revisit if lower-memory cut-down variants of the RTX 2060 are indeed on the way, or if games continue down the historical path of always needing more framebuffer space. The biggest question here isn't whether the 6GB framebuffer will impact the card right now, but whether it will still be enough a year down the line.

Generationally, the RTX 2060 (6GB) does bring more to the table, offering roughly 86% of the performance of the RTX 2070 for 70% of the price. Against its direct predecessor, the GTX 1060 6GB, it’s faster by around 59%. For context, the GTX 1060 6GB was 80-85% faster than the GTX 960 (2GB) at launch, whereas today that gap is more along the lines of 2X or more, with the increased framebuffer as the primary driver. But at $200, the GTX 960 was a true mainstream card, as was the GTX 1060 6GB at its $249 MSRP, despite the $299 Founders Edition pricing.
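To put those figures in perspective, the perf-per-dollar math works out as in the sketch below, a back-of-the-envelope calculation using the rounded relative-performance and price figures quoted above rather than exact benchmark data:

```python
# Back-of-the-envelope perf-per-dollar, using the rounded figures quoted
# above (86% of RTX 2070 performance for 70% of the price); illustrative
# numbers, not exact benchmark data.
cards = {
    # name: (performance relative to the RTX 2070, price in USD)
    "RTX 2070":       (1.00, 500),
    "RTX 2060 (6GB)": (0.86, 350),
}

base_perf, base_price = cards["RTX 2070"]
for name, (perf, price) in cards.items():
    ratio = (perf / price) / (base_perf / base_price)
    print(f"{name}: {ratio:.2f}x the perf-per-dollar of the RTX 2070")
```

By that measure, the RTX 2060 (6GB) delivers roughly 1.23x the performance per dollar of the RTX 2070, which is the sense in which it is the better value within the RTX 20 series stack.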

What makes the $350 pricing at least a bit more palatable is its Radeon competition. Against RX Vega at its current prices, the RTX 2060 (6GB) is near-lethal, so if AMD wants to keep their Vega cards as viable market competitors, they are going to have to reduce prices. Reference-to-reference, the RTX 2060 (6GB) already brings around 95% of RX Vega 64 performance, so card pricing will make all the difference; at that ratio, a Vega 64 would need to sell for roughly $368 or less just to match the RTX 2060 (6GB) on performance-per-dollar. The same goes for the RX 590, whose position in the ‘performance gap’ between the RX Vega 56 and RX 580 is now shared. And alongside potential price changes, AMD still has the value-adds of game bundles and FreeSync compatibility.

At least, that would have been the straightforward case for AMD if not for yesterday’s announcement of game bundles for RTX cards, as well as ‘G-Sync Compatibility’, under which NVIDIA cards will support VESA Adaptive Sync. That driver is due on the same day as the RTX 2060 (6GB) launch, and it could mean the eventual negation of AMD’s FreeSync ecosystem advantage.

Like the RTX 2070, the RTX 2060 (6GB) is less suited as an option for most high-end GTX 10 series owners, and with 6GB of VRAM it’s a little less tempting than it could be as a move up from the GTX 1060 6GB or GTX 980 Ti. The card offers known performance along the lines of the GTX 1070 Ti at very similar power consumption, but brings better value than the existing higher-end RTX 20 series models. And this time, there’s less of a spoiler effect from older Pascal models.

Compared to previous generations, it’s not breaking the price-to-performance curve, as it is still an RTX card pulling double-duty as the new entry point for RTX platform support. That being said, there is no mincing words about the continuing price creep of the past two GeForce series. The price-to-performance characteristics of the RTX 2070, 2080, and 2080 Ti are what render the RTX 2060 (6GB) a better value in comparison, not the card being a great value in absolute terms. But as an upgrade from older mainstream cards, the RTX 2060 (6GB) price point is a lot more reasonable than the RTX 2070’s $500+, where more of the price premium comes from forward-looking hardware-accelerated features like realtime raytracing.

So the RTX 2060 (6GB) is most suitable for gamers who aren’t gung-ho early adopters or longtime enthusiasts. The caveat is the 6GB framebuffer, keeping in mind that the 4GB GTX 980 and 970 now punch below their weight in certain games, given the trends of HDR, HD texture packs, high refresh rates, and more. Beyond that, the RTX 2060 (6GB) and RTX 2070 come with a choice of Anthem or Battlefield V as part of the new bundle. For a prospective buyer, this might not justify $500 but might tip the scales at $350, especially as realtime raytracing can be immediately tried out in Battlefield V. In the same way, the upcoming support for adaptive sync could do the same for those looking to upgrade to a monitor with variable refresh rate.

134 Comments

  • CiccioB - Thursday, January 10, 2019 - link

    I would like to remind you that when interest in 4K began, there were cards like the 980 Ti and the Fury, both unable to cope with such a resolution.
    Did you ever write a single sentence saying that 4K was a gimmick, useless to most people because it was too expensive to support?
    You may know that if you want to get to a point, you have to start walking towards it. If you never start, you'll never reach it.
    NVIDIA started before anyone else in the market. You find it a gimmick move; I find it real innovation. Does it cost too much for you? Yes, but plasma panels also had four zeros on their price tags at the beginning, and at a certain point I could get one myself without going bankrupt.

    AMD and Intel will come to the ray tracing table sooner than you think (for AMD, that means the generation after Navi, since Navi is already finalized without the new compute units).
  • saiga6360 - Thursday, January 10, 2019 - link

    Here's the problem with that comparison: 4K is not simply about gaming, while ray tracing is. 4K started in the movie industry, then home video, then finally games. It was a trend the gaming industry couldn't have avoided if it tried. So yes, NVIDIA started it, but it's not like anybody was surprised, and many thought AMD would soon follow. Ray tracing in real time is a technical feat that not everyone will get on board with right away. I do applaud NVIDIA for starting it, but it's too expensive, and that's a harder barrier to entry than 4K ever was.
  • maroon1 - Monday, January 7, 2019 - link

    Wolfenstein II's "Uber" texture setting is a waste of memory. It does not look any different compared to "Ultra".

    http://m.hardocp.com/article/2017/11/13/wolfenstei...

    Quote from this review
    " We also noticed no visual difference using "Uber" versus "Ultra" Image Streaming unfortunately. In the end, it’s probably not worth it and best just to use the "Ultra" setting for the best experience."
  • sing_electric - Monday, January 7, 2019 - link

    I wish the GPU pricing comparison charts included a relative performance index (even if it were something as simple as the arithmetic mean of all the scores in the review).

    The 2060 looks like it's in a "sweet spot" for performance if you want to spend less than $500 but are willing to spend more than $200, but you can't really tell that from the chart (though if you read the whole review it's clear). Spending the extra $80 to go from a 1060/RX 580 to an RX 590 doesn't net you much performance; OTOH, going from the $280 RX 590 to the $350 2060 gets you a very significant boost in performance.
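    Such an index is straightforward to compute; below is a minimal sketch, using made-up per-game scores rather than the review's actual numbers:

```python
# Minimal sketch of a relative performance index: the arithmetic mean
# of each card's per-game scores, normalized to a baseline card.
# All scores are made up for illustration, not taken from the review.
from statistics import mean

scores = {
    # card: per-game average FPS (hypothetical numbers)
    "GTX 1060 6GB": [60, 55, 48, 70],
    "RX 590":       [66, 60, 52, 77],
    "RTX 2060":     [95, 88, 76, 111],
}

baseline = mean(scores["GTX 1060 6GB"])
for card, fps in sorted(scores.items(), key=lambda kv: mean(kv[1])):
    print(f"{card}: {mean(fps) / baseline * 100:.0f}")  # 100 = baseline
```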
  • Semel - Monday, January 7, 2019 - link

    "11% faster than the RX Vega 56 at 1440p/1080p, "

    A two fans card is faster than a terrible, underperforming due to a bad one fan design reference Vega card. Shocker.

    Now get a proepr Vega 56 card, undervolt it and OC it. And compare to OCed 2060.

    YOu are in for a surprise.
  • CiccioB - Monday, January 7, 2019 - link

    A GPU born for compute tasks, with 480mm^2 of silicon designed for that, 8GB of expensive HBM, and consuming 120W more, getting beaten by a chip in the x60 class sold for the same price (and despite not all of the latter's silicon being used in today's games, it still performs better; let's see what happens when the RT and tensor units are used for other tasks, like ray tracing but also DLSS, AI, and other kinds of effects. And do not forget about mesh shading).

    I wonder how low the price of that crap has to go before someone considers it a good deal.
    The Vega chip failed miserably in its aim of offering any competition to Pascal in the gaming, prosumer, and professional markets; now, with this new cut-down Turing chip, Vega completely loses any reason to even be produced. Each piece sold is a raid on AMD's cash coffers, and it will be EOL sooner rather than later.
    The problem for AMD is that until Navi they will have nothing to put up against Turing (the 590 launch is a joke; you can't really think a company that is serious in this market would do that, can you?) and will constantly lose money in the graphics division. And if Navi is not launched soon enough, they will lose more money the more GPUs they (under)sell. If it is launched too early, they will lose money using a not-yet-mature process with lower yields (and boosting the voltage isn't really going to produce a #poorvolta(ge) device, even at 7nm). These are the problems of being an underdog that needs the latest expensive technology to create something that can even vaguely be considered decent with respect to the competition.

    Let's hope Navi is not a flop like Polaris, or the generation after Turing will cost even more, after prices have already gone up with Kepler, Maxwell, and Pascal.

    Great job with this GCN architecture! Great job, Koduri!
  • nevcairiel - Monday, January 7, 2019 - link

    Comparing two bog standard reference cards is perfectly valid. If AMD wanted to shine there, they should've done a better job.
  • Retycint - Tuesday, January 8, 2019 - link

    Exactly. AMD shouldn't have pushed the Vega series so far past the performance/voltage sweet spot in the first place.
  • sing_electric - Tuesday, January 8, 2019 - link

    I mean, at that point, then, why bother releasing it? If you look at perf/watt, it's not really much of an improvement over Polaris.
  • D. Lister - Monday, January 14, 2019 - link

    @Semel: "...get a proper Vega 56 card, undervolt it..."

    Why is AMD so bad at setting the voltages on their GPUs? How good can their products be if they can't even properly do something that the average weekend overclocker can figure out?

    The answer to the first question is: they aren't bad at it. AMD sets those voltages because they know it is necessary to keep the GPU stable under load. So, when you think yourself more clever than a multi-billion-dollar tech giant and undervolt a Radeon, you make it less reliable outside of scripted benchmark runs.
