Compute & Synthetics

Shifting gears, we'll look at the compute and synthetic aspects of the RTX 2060 (6GB). Since it uses a cut-down configuration of the TU106 GPU found in the RTX 2070, we should expect a similar progression in results.

Starting off with GEMM tests, the RTX 2060's tensor cores are pulled into action with half-precision matrix multiplication, though using binaries originally compiled for Volta. Because Turing is backwards compatible and in the same compute capability family as Volta (sm_75 compared to Volta's sm_70), the benchmark continues to work out-of-the-box, though without any particular Turing optimizations.
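As a side note, the binary-compatibility rule at work here can be sketched in a few lines. This is our own illustration of CUDA's versioning scheme, not a real API: a compiled binary (cubin) targeting one compute capability runs on devices sharing the same major version with an equal-or-higher minor version.

```python
# Illustrative sketch of CUDA's binary-compatibility rule; the helper
# function is hypothetical, not part of any real toolkit API.
def cubin_runs_on(built_for, device):
    """True if a cubin built for `built_for` runs on `device`.

    Both arguments are (major, minor) compute capability tuples.
    Compatibility requires the same major version and a device minor
    version at least as high as the build target.
    """
    built_major, built_minor = built_for
    dev_major, dev_minor = device
    return built_major == dev_major and dev_minor >= built_minor

# Volta-built (sm_70) binaries run on Turing (sm_75)...
print(cubin_runs_on((7, 0), (7, 5)))   # True
# ...but Pascal-built (sm_61) binaries do not.
print(cubin_runs_on((6, 1), (7, 5)))   # False
```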

Compute: General Matrix Multiply Half Precision (HGEMM)

Compute: General Matrix Multiply Single Precision (SGEMM)

For Turing-based GeForce, FP32 accumulation on tensors is capped at half-speed, thus resulting in the observed halved performance. Aside from product segmentation, that higher-precision mode is primarily for deep learning training purposes, something that GeForce cards wouldn't be doing in games or consumer tasks.
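It's easy to see why training workloads want the higher-precision accumulation mode in the first place. A minimal sketch of ours (not the benchmark itself): repeatedly adding a small FP16 value to an FP16 accumulator eventually stalls, because the accumulator's representable spacing grows past the addend, while an FP32 accumulator absorbs the same FP16 inputs without trouble.

```python
import numpy as np

# Sum 20,000 copies of the FP16 value nearest 0.1 (~0.09998) using an
# FP16 accumulator versus an FP32 accumulator.
vals = np.full(20_000, 0.1, dtype=np.float16)

acc16 = np.float16(0.0)
for v in vals:
    acc16 = np.float16(acc16 + v)   # FP16 add: rounds after every step

acc32 = np.float32(vals.astype(np.float32).sum())  # FP32 accumulation

# The FP16 accumulator stalls at 256.0: at that magnitude FP16's
# spacing is 0.25, so adding ~0.1 rounds back to the same value.
print(float(acc16), float(acc32))   # 256.0 vs ~1999.5
```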

Moving on, we have CompuBench 2.0, the latest iteration of Kishonti's GPU compute benchmark suite. It offers a wide array of practical compute workloads, and we've decided to focus on level set segmentation, optical flow modeling, and N-Body physics simulations.

Compute: CompuBench 2.0 - Level Set Segmentation 256

Compute: CompuBench 2.0 - N-Body Simulation 1024K

Compute: CompuBench 2.0 - Optical Flow
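Of these workloads, the N-body simulation is the easiest to sketch. The following is an illustrative all-pairs step in NumPy, not CompuBench's actual GPU kernel; the integrator, softening term, and sizes are our own assumptions.

```python
import numpy as np

# Minimal all-pairs N-body step (illustrative only; CompuBench's kernel
# runs on the GPU with its own integrator and constants).
def nbody_step(pos, vel, mass, dt=1e-3, softening=1e-2):
    # Pairwise displacement vectors: r[i, j] = pos[j] - pos[i]
    r = pos[None, :, :] - pos[:, None, :]
    # Softened inverse-cube distances avoid a divide-by-zero on i == j
    inv_d3 = (np.sum(r * r, axis=-1) + softening**2) ** -1.5
    # Acceleration on each body from every other (G folded into masses)
    acc = np.sum(mass[None, :, None] * r * inv_d3[:, :, None], axis=1)
    vel = vel + dt * acc
    pos = pos + dt * vel
    return pos, vel

rng = np.random.default_rng(0)
pos, vel = nbody_step(rng.standard_normal((256, 3)),
                      np.zeros((256, 3)), np.ones(256))
print(pos.shape, vel.shape)   # (256, 3) (256, 3)
```

The all-pairs force sum is O(N²), which is exactly why it maps so well onto a GPU's wide parallelism.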

Moving on, we'll also look at single precision floating point performance with FAHBench, the official Folding @ Home benchmark. Folding @ Home is the popular Stanford-backed research and distributed computing initiative that distributes work to millions of volunteer computers over the internet, each of which is responsible for a tiny slice of a protein folding simulation. FAHBench can test both single precision and double precision floating point performance, with single precision being the most useful metric for most consumer cards due to their low double precision performance.

Compute: Folding @ Home Single Precision
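To put rough numbers on that "low double precision performance": GeForce Turing executes FP64 at 1/32 the FP32 rate. A back-of-the-envelope sketch, with the RTX 2060's core count and boost clock assumed from the spec sheet:

```python
# Theoretical RTX 2060 throughput (figures assumed from the spec sheet:
# 1920 CUDA cores, 1680 MHz boost, 2 FLOPs per core per clock for FMA).
cores, boost_ghz = 1920, 1.680
fp32_tflops = cores * 2 * boost_ghz / 1000   # ~6.45 TFLOPS FP32

# GeForce Turing runs FP64 at 1/32 the FP32 rate.
fp64_tflops = fp32_tflops / 32               # ~0.20 TFLOPS FP64
print(f"FP32: {fp32_tflops:.2f} TFLOPS, FP64: {fp64_tflops:.2f} TFLOPS")
```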

Next is Geekbench 4's GPU compute suite. A multi-faceted test suite, Geekbench 4 runs seven different GPU sub-tests, ranging from face detection to FFTs, and then averages out their scores via their geometric mean. As a result Geekbench 4 isn't testing any one workload, but rather is an average of many different basic workloads.

Compute: Geekbench 4 - GPU Compute - Total Score
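The aggregation itself is a plain geometric mean. With hypothetical sub-test scores (the numbers below are made up, not real results), it works out like this:

```python
import math

# Seven hypothetical sub-test scores (illustrative values only).
scores = [52000, 61000, 48000, 75000, 39000, 66000, 58000]

# Geometric mean computed via logs, which avoids overflowing the raw
# product of large scores.
total = math.exp(sum(math.log(s) for s in scores) / len(scores))
print(round(total))
```

Unlike an arithmetic mean, the geometric mean keeps one outlier sub-test from dominating the overall score.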

We'll also take a quick look at tessellation performance.

Synthetic: TessMark, Image Set 4, 64x Tessellation

Finally, for looking at texel and pixel fillrate, we have the Beyond3D Test Suite. This test offers a slew of additional tests – many of which we use behind the scenes or in our earlier architectural analysis – but for now we'll stick to simple pixel and texel fillrates.

Synthetic: Beyond3D Suite - Pixel Fillrate

Synthetic: Beyond3D Suite - Integer Texture Fillrate (INT8)
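As a sanity check on these charts, theoretical peak fillrates fall straight out of the ROP/TMU counts and the clockspeed. A quick sketch for the RTX 2060, assuming the spec-sheet figures of 48 ROPs, 120 texture units, and a 1680 MHz boost clock (measured results land below these peaks):

```python
# Back-of-the-envelope theoretical fillrates for the RTX 2060
# (48 ROPs, 120 TMUs, 1680 MHz boost -- figures assumed from the
# published spec sheet).
rops, tmus = 48, 120
boost_clock_ghz = 1.680

pixel_fill_gpix = rops * boost_clock_ghz   # ~80.6 Gpixels/s peak
texel_fill_gtex = tmus * boost_clock_ghz   # ~201.6 Gtexels/s peak
print(f"{pixel_fill_gpix:.1f} Gpix/s, {texel_fill_gtex:.1f} Gtex/s")
```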

 

Comments (134)

  • just4U - Wednesday, January 23, 2019 - link

Wait, late to this and likely no one will read it, but shoot, you never know. I have Vega cards. I undervolt and overclock. They work great.
  • sing_electric - Monday, January 7, 2019 - link

Here's the thing, though: right now, there ISN'T a card on the market that offers anything like that level of performance for that price, if you can actually buy one for close to MSRP. The RX 590 is almost embarrassing in this test; a recently-launched card (though based on older tech) for $60 less than the 2060 but offering nowhere near the performance. The way I read the chart on performance/prices, there's good value at ~$200 (for a 580 card), then no good values until you get to the $350 2060 (assuming it's available for close to MSRP). If AMD can offer the Vega 56 for, say, $300 or less, it becomes a good value, but today, the best price I can find on one is $370, and that's just not worth it.
  • jrs77 - Monday, January 7, 2019 - link

I'm not saying that the 2060 isn't good value, but it's simply priced way too high for a midrange card, which the xx60-series is supposed to be.
    Midrange = $1000 gaming-rig, and that only leaves some $200-250 for the GPU. And as I wrote, even the 1060 was out of that price range for most of the last two years.
  • sing_electric - Monday, January 7, 2019 - link

    I totally get your point - but to some extent, it's semantics. I'd never drop the ~$700 that it costs to get a 2080 today, but given that that card exists and is sold to consumers as a gaming card, it is now the benchmark for "high end." The RTX 2060 is half that price, so I guess is "mid range," even if $350 is more than I'd spend on a GPU.

    We've seen the same thing with phones - $700 used to be 'premium' but now the premium is more like $1k.

    The one upside of all this is that the prices mean that there's likely to be a lot of cards like the 1060/1070/RX 580 in gaming rigs for the next few years, and so game developers will likely bear that in mind when developing titles. (On the other hand, I'm hoping maybe AMD or Intel will release something that hits a much better $/perf ratio in the next 2 years, finally putting pricing pressure on Nvidia at the mid/high end which just doesn't exist at the moment.)
  • Bluescreendeath - Monday, January 7, 2019 - link

It could be that the RTX 2060 is not a midrange card but a lower high-end one. Most xx60 cards in the past were midrange and cost around $200-$300, but not all of them: go back to the GTX 200 series and the GTX 260's MSRP was $400, making it more of an upper-range card. The Founders Edition of the 1060 also launched at $300.
  • dave_the_nerd - Monday, January 7, 2019 - link

    Weeeeeeeeeelllll.... before all the mining happened, the 970 was a pretty popular card at $300-$325. (At one point iirc it was the single most popular discrete GPU on Steam's hardware survey.)
  • Vayra - Wednesday, January 9, 2019 - link

    Yeah, I think 350 is just about the maximum Nvidia can charge for midrange. The 970 had the bonus of offering 780ti levels of performance very shortly after that card launched. Today, we're looking at almost 3 years for such a jump (1080 > 2060).
  • StrangerGuy - Wednesday, January 9, 2019 - link

I paid an inflated $450 for my launch 1070 2.5 years ago, and this 2060 is barely faster at $100 less. Godawful value proposition, especially when release dates are taken into consideration.
  • ScottSoapbox - Monday, January 7, 2019 - link

    I wonder if custom 2060 cards will add 2GB more VRAM and how much that addition will cost.
  • A5 - Monday, January 7, 2019 - link

    It's been a *long* time since I've seen a board vendor offer a board with more VRAM than spec'd by the GPU maker. I would be surprised if anyone did it...easier to point people at the 2070.
