Final Words

As we wrap up, it’s clear that judging the RTX 2070 involves the same themes that surfaced in the RTX 2080 Ti and 2080 review. First is the forward-looking feature set that has yet to publicly launch. Second, and closely intertwined, is the premium pricing based on those features, rather than on conventional gaming performance. And last is the existing competition in the form of Pascal, especially where the RTX cards fall into the same performance tiers.

For the RTX 2070 Founders Edition, those themes are even more relevant and harder to dismiss. By its nature, the card is an entry-level model for consumers interested in real-time ray tracing and other RTX platform features, as well as the traditional high-end card for prospective enthusiasts and upgraders on a budget. In the past couple of generations, these ‘enthusiast value’ parts have essentially provided last-gen flagship performance (or better) at non-flagship prices. For example:

  • GTX 1070 for GTX 980 Ti
  • GTX 970 for GTX 780 Ti
  • GTX 770 refresh of GTX 680
  • RX 580/480 for R9 390
  • R9 390 refresh of R9 290
  • R9 280X refresh of HD 7970

Going back to the numbers, the RTX 2070 Founders Edition TDP and boost clock tweaks only amount to around a 4% gain over the reference 2070 at 4K. The difference is not much in the grand scheme of things, but the setup makes more sense when looking at the GTX 1080 competition. The reference RTX 2070 is faster than the GTX 1080 at 4K and 1440p by only around 10%, a gap that is easily closed by factory-overclocked custom cards.

By hardware resources, the RTX 2070 was expected to deliver around 75% of the RTX 2080’s performance. But Founders-to-Founders and reference-to-reference, the RTX 2070 brings around 83% of the RTX 2080’s 1440p performance (and 82% of its 4K performance). So the performance gap is comparable to previous generations, where the GTX 1070 brought 81% of the performance of the GTX 1080, and the GTX 970 brought 87% of the GTX 980. Except here, the RTX 2080 is only managing GTX 1080 Ti level performance in traditional gaming.

Looking back at the Pascal launch, the GTX 1070 brought a 57% 1440p performance gain over GTX 970, which was substantive but with its $450 Founders Edition pricing, not necessarily a must-buy for GTX 970 owners. On the other hand, GTX 770/670 owners had a lot to gain from that upgrade.

Here with Turing, the RTX 2070 is ahead of the GTX 1070 reference-to-reference around 35% and 36% at 1440p and 4K, respectively. In its Founders Edition guise, the difference is around 41% for both resolutions. Either way, the performance lies somewhere between the GTX 1080 and 1080 Ti, except with a $600 Founders Edition price. In that sense, it offers less than last generation but at a higher price, the premium being tied to real time raytracing and other hardware-accelerated features. And when those features finally release, there's no clear sense of the quality or resolution compromises necessary to run those features.
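The relative-performance figures throughout this conclusion reduce to simple framerate ratios. A minimal sketch of that arithmetic, using hypothetical 1440p framerates rather than the review's actual benchmark data:

```python
def percent_of(card_fps: float, baseline_fps: float) -> float:
    """Performance of one card expressed as a percentage of another."""
    return round(100 * card_fps / baseline_fps, 1)

def percent_gain(new_fps: float, old_fps: float) -> float:
    """Generational performance gain of a new card over an old one."""
    return round(100 * (new_fps / old_fps - 1), 1)

# Hypothetical 1440p framerates chosen only to illustrate the ratios
# discussed above; these are not measured results.
rtx_2080_fps, rtx_2070_fps, gtx_1070_fps = 100.0, 83.0, 61.5

print(percent_of(rtx_2070_fps, rtx_2080_fps))    # -> 83.0 (the ~83% figure)
print(percent_gain(rtx_2070_fps, gtx_1070_fps))  # -> 35.0 (the ~35% gen-on-gen gain)
```

The same two helpers cover every comparison in the text: tier-to-tier positioning is a `percent_of` ratio, while upgrade value is a `percent_gain` over the card being replaced.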

For current GTX 10 series owners, the RTX 2070 is largely a side-grade, offering known performance for possibly worse power efficiency. For those with low-end cards, or 900 series and older products, the $500/$600 budget pulls in a number of other alternatives: the GTX 1080, RX Vega 64, or even the GTX 1070 Ti. The standard $500 MSRP, at which some cards are currently priced, helps the RTX 2070 stay in the price/performance race; at $600, the Founders Edition can carry a $100+ premium over a competing product. In particular, the sub-$500 GTX 1080 cards are a major spoiler for the RTX 2070, offering equivalent performance at a lower price. A prospective RTX 2070 buyer will have to be honest with themselves about utilizing RTX features when the time comes, and about any intentions of upgrading monitors for HDR, higher resolution and/or refresh rate, and variable refresh rate technology.

Comments

  • TheinsanegamerN - Tuesday, October 16, 2018 - link

If you had truly been building since the 486 era, then you would know that, despite the price jumps, computers today are MONUMENTALLY cheaper than they were in the 90s. You don't see $4000 desktops in stores today; you sure did in 1991.
  • TheinsanegamerN - Tuesday, October 16, 2018 - link

I mean, seriously, a 4MB RAM stick cost $140 in 1994, and you are complaining that 32 GB costs $300 today?
  • Dragonstongue - Tuesday, October 16, 2018 - link

    hold yer horses there lad, let's run some calcs.
    $2000 in 1991 would be $3,684.96 today... I see LOTS of computers people build that are ~ this level,
    and $3600 does not buy "cream of the crop" parts today; very high end no doubt, but also not "best of the best".

    Use a different number: $250 in 1991 money, which is ~ mid range GPU pricing these days, would be $460.62.

    I guess to put it a slightly different way, it depends on what one is buying, but the "value" of the $ spent is often times equivalent, much worse, or only "slightly" better than what we have today.

    We may get "more" for the $, but, all things being equal, we also pay more for what is received. I think the "only" thing in my books that has gotten far less expensive, taking everything into account, is hard drive pricing: $50 in 1991 would be $92.12 today, and for $92 you can pretty easily get a 2TB hard drive, which is WAY more substantial in pretty much every regard than what $50 would have got you in 1991 ^.^
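The inflation adjustments in the comment above amount to multiplying by a single CPI ratio. A minimal sketch, assuming the ~1.8425 multiplier implied by the quoted figures (the exact constant below is inferred from those numbers, not from an official CPI table):

```python
# 1991 -> ~2018 CPI multiplier, back-calculated from the figures quoted
# above ($2000 -> $3,684.96, $250 -> $460.62, $50 -> $92.12). For real
# work, look up the official CPI series instead of assuming this value.
CPI_MULTIPLIER_1991_TO_2018 = 1.84248

def adjust_for_inflation(dollars_1991: float) -> float:
    """Convert a 1991 dollar amount to its approximate 2018 equivalent."""
    return round(dollars_1991 * CPI_MULTIPLIER_1991_TO_2018, 2)

print(adjust_for_inflation(2000))  # -> 3684.96
print(adjust_for_inflation(250))   # -> 460.62
print(adjust_for_inflation(50))    # -> 92.12
```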
  • Yojimbo - Tuesday, October 16, 2018 - link

    The hard drive you could get for $50 in 1991 was a 0 MB hard drive.

    I don't understand why you decided to use $2,000 in 1991 when the post you replied to talked about $4,000 in 1991. That's over $7,200 today. A $2,000 computer in 1991 was pretty mid range. So what's the big deal if $3,600 does not buy "cream of the crop" parts today? $3,600 today gets you something certainly high end and not mid-range. Also, you are talking about driving a range of visuals that just didn't exist for consumers back in 1991. You can spend a good chunk of that $3,600 on a decent 4K monitor, driving almost 4 times the pixels of a standard 1080p monitor and over 8 times the pixels of running at 720p. I don't think these massive differences in display capabilities existed back then. Your extra money back then was mostly going towards a faster CPU, faster I/O, and enhanced sound capabilities.
  • Vayra - Monday, October 22, 2018 - link

    You wot? Back in 1994, 1600x1200 was a thing already, and the vast majority played on 800x600 or worse. In fact, even that was still a high end res.
  • Yojimbo - Monday, October 22, 2018 - link

    So who played at 1600x1200? I mean 8K has been a thing for several years but who plays games at it? The resolution scaling game didn't really kick off until later. In the 1990s and early 2000s there was a whole lot of relatively easy visual quality improvements to be achieved through better algorithms. I don't believe people were spending massive amounts of money buying monitors with very small dot pitches so they could play games at high resolutions with crisper images. I'm sure they spent more for bigger monitors, but it was probably getting a 17 inch versus a 15 inch. That sort of difference in size doesn't induce someone to need a bigger GPU to push more pixels.
  • Yojimbo - Monday, October 22, 2018 - link

    "GPU" should read "graphics accelerator".
  • Yojimbo - Tuesday, October 16, 2018 - link

    Yeah, if I remember correctly my father bought me a Dell 486SX/25 with 4 MB of RAM, a monitor, keyboard, mouse, 120 MB hd, 3.5 in and 5.25 in floppy drives. It just had the PC speakers and a standard 2d graphics adapter. It cost $1,600 I think, which is $3,000 today. PC gaming is much cheaper today.

    The GPU has become more and more important to gaming performance in relation to the other components of the system. So people spend more money on their GPUs to achieve higher performance and no longer spend $1,000 for a CPU or significantly extra money for super fast RAM or a super fast hard drive.
  • DanNeely - Tuesday, October 16, 2018 - link

    My parents got a similar spec no-name white box PC with non accelerated graphics adapter for $1100 in summer '93. Upgrades over the next few years were 4mb more ram, CDROM+sound blaster clone, ~500 MB hdd (I think, not 100% sure on the capacity), 14.4 modem. I bought the ram and about half the HDD price as a teen, remainder were Christmas purchases.
  • Eletriarnation - Tuesday, October 16, 2018 - link

    The 970 is still fine, so you really don't need to worry. Even if you did need an upgrade, prices for the last generation are dropping as they always have, and if you spent now the same amount that you spent on a 970 at launch, you'd probably be able to get a 1080, so what's really the problem? Nvidia is making the 20xx series larger and more expensive because other people are willing to pay for them; it's as simple as that.
