Compute & Synthetics

Shifting gears, we'll look at the compute and synthetic performance of the GTX 1650. Having already covered the GTX 1660 Ti and GTX 1660, we aren't expecting anything too surprising here.

We'll begin with CompuBench 2.0, the latest iteration of Kishonti's GPU compute benchmark suite. It offers a wide array of practical compute workloads, and we've opted to focus on three of them: level set segmentation, optical flow modeling, and N-body physics simulation.

Compute: CompuBench 2.0 - Level Set Segmentation 256

Compute: CompuBench 2.0 - N-Body Simulation 1024K

Compute: CompuBench 2.0 - Optical Flow
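
To give a sense of what the N-Body test is exercising, below is a minimal, CPU-side sketch of a direct-summation gravitational N-body step in Python/NumPy. This is purely illustrative: CompuBench runs a far larger, OpenCL-accelerated workload of this general type, and the body count, timestep, and softening term here are arbitrary values chosen for the example.

```python
import numpy as np

def nbody_step(pos, vel, mass, dt=0.01, softening=0.1):
    """One direct-summation gravitational N-body step (O(N^2) force pass).

    pos, vel: (N, 3) arrays; mass: (N,) array. Units are arbitrary (G = 1).
    """
    # Pairwise displacement vectors: diff[i, j] = pos[j] - pos[i]
    diff = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]
    # Softened squared distances avoid a singularity when bodies get very close
    dist_sq = np.sum(diff ** 2, axis=-1) + softening ** 2
    inv_r3 = dist_sq ** -1.5
    # Acceleration on each body is the mass-weighted sum over all other bodies
    acc = np.sum(diff * (mass[np.newaxis, :, None] * inv_r3[:, :, None]), axis=1)
    # Simple Euler integration of velocity and position
    vel = vel + acc * dt
    pos = pos + vel * dt
    return pos, vel

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 1024  # the CompuBench "1024K" test uses vastly more bodies than this
    pos = rng.standard_normal((n, 3)).astype(np.float32)
    vel = np.zeros((n, 3), dtype=np.float32)
    mass = np.ones(n, dtype=np.float32)
    for _ in range(10):
        pos, vel = nbody_step(pos, vel, mass)
    print(pos[0])
```

The O(N²) pairwise force pass is what makes this class of workload such a good fit for GPUs, since every body's acceleration can be computed independently of the others.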

Moving on, we'll also look at single precision floating point performance with FAHBench, the official Folding @ Home benchmark. Folding @ Home is the popular Stanford-backed research and distributed computing initiative that farms out work to millions of volunteer computers over the internet, each of which handles a tiny slice of a protein folding simulation. FAHBench can test both single precision and double precision floating point performance; single precision is the most useful metric for most consumer cards, given their low double precision throughput.

Compute: Folding @ Home Single Precision
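
To make the single vs. double precision distinction a bit more concrete, here is a rough sketch that times the same multiply-add arithmetic at both precisions with NumPy on the CPU; the array size and repetition count are arbitrary choices. It doesn't model any particular GPU - the CPU gap here is largely a memory bandwidth effect - but it illustrates the kind of FP32/FP64 comparison FAHBench reports, where consumer GPUs typically suffer a far larger FP64 penalty than the roughly 2x seen on a CPU.

```python
import time
import numpy as np

def time_fma(dtype, n=10_000_000, reps=20):
    """Time a simple multiply-add loop at the given floating point precision."""
    rng = np.random.default_rng(0)
    a = rng.random(n, dtype=dtype)
    b = rng.random(n, dtype=dtype)
    c = rng.random(n, dtype=dtype)
    start = time.perf_counter()
    for _ in range(reps):
        c = a * b + c  # the core operation of many MD-style force kernels
    return time.perf_counter() - start

if __name__ == "__main__":
    t32 = time_fma(np.float32)
    t64 = time_fma(np.float64)
    print(f"float32: {t32:.3f} s, float64: {t64:.3f} s, ratio: {t64 / t32:.2f}x")
```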

Next is Geekbench 4's GPU compute suite. A multi-faceted test suite, Geekbench 4 runs seven different GPU sub-tests, ranging from face detection to FFTs, and then aggregates their scores via a geometric mean. As a result, Geekbench 4 isn't testing any one workload, but rather reports an average across many different basic workloads.

Compute: Geekbench 4 - GPU Compute - Total Score
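
As a quick illustration of why a geometric mean is used for the aggregation, the sketch below compares it against a plain arithmetic mean on a set of made-up sub-scores (these are hypothetical numbers, not real Geekbench 4 results); a single outlier sub-test moves the geometric mean far less.

```python
import math

# Hypothetical sub-test scores -- not real Geekbench 4 results
subscores = [52000, 61000, 48000, 250000, 55000, 58000, 50000]

arith_mean = sum(subscores) / len(subscores)
geo_mean = math.exp(sum(math.log(s) for s in subscores) / len(subscores))

print(f"arithmetic mean: {arith_mean:,.0f}")  # pulled up hard by the 250k outlier
print(f"geometric mean:  {geo_mean:,.0f}")    # much less sensitive to the outlier
```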

In lieu of Blender, which has yet to officially release a stable version with CUDA 10 support, we have the LuxRender-based LuxMark (OpenCL) and V-Ray (OpenCL and CUDA).

Compute/ProViz: LuxMark 3.1 - LuxBall and Hotel

Compute/ProViz: V-Ray Benchmark 1.0.8

We'll also take a quick look at tessellation performance.

Synthetic: TessMark, Image Set 4, 64x Tessellation

Finally, to look at texel and pixel fillrates, we have the Beyond3D Test Suite. This suite offers a slew of additional tests – many of which we use behind the scenes or in our earlier architectural analysis – but for now we'll stick to simple pixel and texel fillrates.

Synthetic: Beyond3D Suite - Pixel Fillrate

Synthetic: Beyond3D Suite - Integer Texture Fillrate (INT8)

Synthetic: Beyond3D Suite - Floating Point Texture Fillrate (FP32)
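
For context on the fillrate charts, the theoretical peaks fall out of simple arithmetic: ROP or texture unit count multiplied by clockspeed. The sketch below plugs in the GTX 1650's reference specifications as assumed inputs (32 ROPs, 56 texture units, 1665MHz boost clock); measured Beyond3D results will land somewhat below these peaks, since achieved clocks and memory bandwidth also come into play.

```python
# Theoretical peak fillrates from unit counts x clockspeed.
# Reference GTX 1650 figures assumed: 32 ROPs, 56 texture units, 1665MHz boost.
rops = 32
texture_units = 56
boost_clock_ghz = 1.665

pixel_fillrate_gpix = rops * boost_clock_ghz                 # Gpixels/sec
texel_fillrate_gtex_int8 = texture_units * boost_clock_ghz   # Gtexels/sec (INT8)

print(f"Peak pixel fillrate:      {pixel_fillrate_gpix:.1f} Gpixels/s")
print(f"Peak INT8 texel fillrate: {texel_fillrate_gtex_int8:.1f} Gtexels/s")
```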

Comments

  • eva02langley - Sunday, May 5, 2019

    Hey, Turing is a joke. The only thing Turing brought is a different price bracket. Nvidia took two and a half years before releasing Turing... so I don't see the age of Polaris as an issue when new cards are coming in a couple of months.
  • Ryan Smith - Saturday, May 4, 2019

    "This is by far the most 1650 friendly review I have seen online."

    Having finally read the other GTX 1650 reviews (I don't read them beforehand, to avoid coloring my own video card reviews), I agree with you on that. Still, I stand by my article.

    AMD is by no means desperate here. But they are willing to take thinner profit margins than NVIDIA does. And that creates all kinds of glorious havoc in the sub-$200 video card market.

    No one card wins in all categories here; one has better performance, another has better power efficiency. So it makes things a little more interesting for buyers as they now need to consider what they are using a card for - and what attributes they value the most.

    Next to the GTX 1650, the RX 570 is really kind of a lumbering beast. The power consumption difference for the 11% performance advantage is quite high. But at the end of the day it's still 11% faster for the same price, so if you're buying on a pure price/performance basis, then it's an easy call to make.

    As for Navi, AMD will eventually have a successor of some sort for Polaris 11. However I'm not expecting it in Q3; AMD normally launches the low-end stuff later.
  • eva02langley - Sunday, May 5, 2019

    You can stand by your article, but that doesn't mean you are right because of it. You are living in LALA land, Ryan, for even believing that a 75W difference is important. It would be important if the cards had the same performance at a similar price... but they don't.

    At this point, you can probably undervolt the RX 570 pretty close to the 1650 if that was sooooo important...

    I made the calculation that it is going to cost you $15-20 of power per year if you play 4 hours per day. You cannot defend this. It is insanity.

    https://www.youtube.com/watch?v=um63-_YPNcA

    https://youtu.be/WTaSIG5Z-HM
  • yannigr2 - Thursday, May 9, 2019

    AMD has, for all these last years, been defending its position with smaller profit margins. It's not something it is only doing now, and it's not something it is doing only with the RX 570, so there's no reason to question its ability to maintain this price.

    One other thing: while the review tests the GTX 1650 against the 4GB RX 570, when there is something to be said about pricing and profit margins, and about AMD's ability to keep selling the RX 570 under $150, the 8GB model of the RX 580 is used. No mention of the much cheaper 4GB version that is used in the review.

    At the end of the day, the RX 570 is not 11% faster for the same price. It's 11% faster for $30 less, and the only question is whether the GTX 1650's power efficiency and a couple of other features are enough to justify the loss of 11% performance (or more if the RX 570 model was not overclocked) and a significantly (for this price range) higher price tag.

    And no, we can't assume that in the near future AMD's prices will just jump 20% to make the GTX 1650 less of an expensive card, especially with Navi not far away, meaning that older hardware will have to be sold off to make room for the new models, or will simply stay at those low prices so as not to interfere with newer Navi models that could come in at $200 and up.
  • yannigr2 - Thursday, May 9, 2019

    EDIT - clarification: In many tests, there are scores for the RX 570 4GB and not for the 8GB model.
  • catavalon21 - Saturday, May 4, 2019

    In the GTX 1660 and 1660 Ti reviews, the RX 570 wasn't included; however, the RX 590 and RX 580 are shown taking 201 and 222 seconds respectively to complete the V-Ray benchmark 1.0.8, whereas this chart shows the RX 570 taking only 153 seconds. The GTX 1660 is shown taking 109 seconds in both that chart and this one. Since the 570 typically falls short of its 580/590 siblings, how did it manage to stomp them in this benchmark?

    https://www.anandtech.com/show/14071/nvidia-gtx-16...
  • GreenReaper - Saturday, May 4, 2019

    I think this is a reasonable review. Using twice the power at maximum load is not an insignificant factor over the life of the card. But it depends on whether the additional heat means a cost or just means you can run your heating less, how often you game, who is paying for your power, etc. Then there are factors such as Linux source driver support, which may or may not matter for a particular person.

    If pressed, I'd get the RX 570 in a heartbeat, but maybe not if I wanted to put it in my microserver (admittedly, I'd also need a low-profile card for that). But I'd rather wait for Navi in an APU. :-)
  • Koenig168 - Saturday, May 4, 2019

    The article tries too hard to make Nvidia look good despite the GTX 1650 being inferior in performance compared to the RX 570 and overpriced for what it is offering.
  • Oxford Guy - Saturday, May 4, 2019

    The last time I remember any major tech news site giving Nvidia any grief was in the Fermi days, with the 480 and especially the 465. As bad as the 480 was, people still bragged about their triple 480 SLI systems, and dual 480 SLI was routinely featured in benchmarks.
  • Haawser - Sunday, May 5, 2019

    The 4GB RX 570 is $130, not $150, and it beats the 4GB 1650 out of sight. It also only draws ~120W, which is not a lot seeing as the majority of 1650s (i.e., those with a 6-pin connector) draw ~90W anyway.

    The actual 75W 1650s should be $99, and the rest shouldn't even exist, because at $150-160 they are a complete and utter joke.
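
As an aside on the power-cost figures traded in the comments above, the arithmetic is straightforward to sanity-check. The sketch below uses assumed inputs rather than measured ones: a ~75W average draw gap under load, 4 hours of gaming per day, and a few illustrative electricity rates (actual rates vary widely by region).

```python
# Rough annual cost of a sustained power-draw difference while gaming.
# Assumptions, not measurements: ~75W average gap, 4 hours/day of load.
power_gap_watts = 75
hours_per_day = 4

kwh_per_year = power_gap_watts / 1000 * hours_per_day * 365  # ~109.5 kWh

for rate in (0.10, 0.13, 0.20):  # illustrative $/kWh rates
    print(f"${rate:.2f}/kWh -> ${kwh_per_year * rate:.2f} per year")
```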
