Power, Temperature, & Noise

Last, but not least, is our look at power, temperatures, and noise levels. While a high-performing card is good in its own right, an excellent card delivers great performance while also keeping power consumption and the resulting noise levels in check.

GeForce Video Card Voltages
Card            Max     Idle
GTX 1650 Super  1.05v   0.65v
GTX 1660        1.05v   0.65v

If you’ve seen one TU116 card, then you’ve seen them all as far as voltages are concerned. Even with this cut-down part, NVIDIA still lets the GTX 1650 Super run at up to 1.05v, allowing it to boost as high as 1950MHz.

GeForce Video Card Average Clockspeeds
Game GTX 1660 GTX 1650 Super GTX 1650
Max Boost Clock 1935MHz 1950MHz 1950MHz
Boost Clock 1785MHz 1725MHz 1695MHz
Shadow of the Tomb Raider 1875MHz 1860MHz 1845MHz
F1 2019 1875MHz 1875MHz 1860MHz
Assassin's Creed: Odyssey 1890MHz 1890MHz 1905MHz
Metro: Exodus 1875MHz 1875MHz 1860MHz
Strange Brigade 1890MHz 1860MHz 1860MHz
Total War: Three Kingdoms 1875MHz 1890MHz 1875MHz
The Division 2 1860MHz 1830MHz 1800MHz
Grand Theft Auto V 1890MHz 1890MHz 1905MHz
Forza Horizon 4 1890MHz 1875MHz 1890MHz

Meanwhile the clockspeed situation looks relatively good for the GTX 1650 Super. Despite having a 20W lower TDP than the GTX 1660 and an official boost clock 60MHz lower, in practice our GTX 1650 Super card is typically within one step of the GTX 1660. This also keeps it fairly close to the original GTX 1650, which boosted higher in some cases and lower in others. In practice this means that the performance difference between the three cards is being driven almost entirely by the differences in CUDA core counts, as well as the use of GDDR6 in the GTX 1650 Super. Clockspeeds don’t seem to be a major factor here.
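The table's numbers bear this out; here's a quick sketch averaging the per-game clocks, with the gap expressed in boost-clock steps. The ~15MHz step size is our assumption based on Turing's boost granularity, not a figure stated in the article:

```python
# Average in-game clockspeeds from the table above, with the card-to-card
# gap expressed in boost-clock steps. The ~15MHz step size is an assumption
# based on Turing's boost behavior, not a figure from the article.
clocks_mhz = {
    "GTX 1660":       [1875, 1875, 1890, 1875, 1890, 1875, 1860, 1890, 1890],
    "GTX 1650 Super": [1860, 1875, 1890, 1875, 1860, 1890, 1830, 1890, 1875],
    "GTX 1650":       [1845, 1860, 1905, 1860, 1860, 1875, 1800, 1905, 1890],
}

# Mean observed clock per card across the nine games
averages = {card: sum(mhz) / len(mhz) for card, mhz in clocks_mhz.items()}
for card, avg in averages.items():
    print(f"{card}: {avg:.0f}MHz average observed clock")

gap = averages["GTX 1660"] - averages["GTX 1650 Super"]
print(f"GTX 1660 vs GTX 1650 Super: {gap:.1f}MHz (~{gap / 15:.1f} boost steps)")
```

The GTX 1660 averages around 1880MHz versus roughly 1872MHz for the GTX 1650 Super, a gap well under a single 15MHz step.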

Idle Power Consumption

Load Power Consumption - Shadow of the Tomb Raider

Load Power Consumption - FurMark

Shifting to power consumption, we see the cost of the GTX 1650 Super’s greater performance. It’s well ahead of the GTX 1650, but it’s pulling more power in the process. In fact I am a bit surprised by just how close it is (at the wall) to the GTX 1660, especially under Tomb Raider. While on paper it has a 20W lower TDP, in practice it actually fares a bit worse than the next level TU116 card. It’s only under FurMark, a pathological use case, that we see the GTX 1650 Super slot in under the GTX 1660. The net result is that the GTX 1650 Super seems to be somewhat inefficient, at least by NVIDIA standards. It doesn’t save a whole lot of power versus the GTX 1660 series, despite the lower performance.
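As a back-of-the-envelope illustration of the efficiency point, performance per watt is just frame rate divided by power draw. The numbers below are hypothetical placeholders for demonstration only, not the measured values from our charts:

```python
# Sketch of a performance-per-watt comparison. All frame rates and wattages
# here are hypothetical placeholders -- the article's charts hold the real
# measurements -- but the arithmetic shows why drawing nearly the same wall
# power at lower performance means worse efficiency.
def perf_per_watt(avg_fps, wall_power_w):
    """Average frames per second delivered per watt of system power."""
    return avg_fps / wall_power_w

# Placeholder figures for illustration only:
efficiency = {
    "GTX 1660":       perf_per_watt(60.0, 240.0),  # hypothetical fps / W
    "GTX 1650 Super": perf_per_watt(54.0, 238.0),  # hypothetical fps / W
}
for card, fps_per_w in efficiency.items():
    print(f"{card}: {fps_per_w:.3f} fps/W")
```

With nearly identical wall power but lower frame rates, the fps/W ratio lands below the GTX 1660's, which is the inefficiency the charts show.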

This also means it doesn’t fare especially well against the Radeon RX 5500 XT series. As with the GTX 1660, the RX 5500 XT draws less power than the GTX 1650 Super under Tomb Raider. It’s only by maxing out all of the cards with FurMark that the GTX 1650 Super pulls ahead. In practice I expect real-world conditions to fall between these two values – Tomb Raider may be a bit too hard on these low-end cards – but regardless, this would put the GTX 1650 Super only marginally ahead of the RX 5500 XT.

Idle GPU Temperature

Load GPU Temperature - Shadow of the Tomb Raider

Load GPU Temperature - FurMark

Looking at GPU temperatures, Zotac’s GTX 1650 Super card puts up decent numbers. While the 100W card understandably gets warmer than its 75W GTX 1650 sibling, we never see GPU temperatures cross 70C. The card is keeping plenty cool.

Idle Noise Levels

Load Noise Levels - Shadow of the Tomb Raider

Load Noise Levels - FurMark

But when it comes time to measure how much noise the card is producing, we find a different picture. In order to keep the GPU at 69C, the Zotac GTX 1650 Super’s fans are having to do some real work. And unfortunately, the small 65mm fans just aren’t very quiet once they have to spin up. To be sure, the fans as a whole aren’t anywhere near max load – we recorded just 55% as reported by NVIDIA’s drivers – however this is still enough to push noise levels over 50 dB(A).
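For context on what a dB(A) gap means in practice, sound pressure scales as 10^(Δ/20), and perceived loudness roughly doubles for every +10 dB. A quick sketch; note the 43 dB(A) "quieter card" figure is a hypothetical comparison point, not a measurement from this review:

```python
# What a dB(A) gap means in practice: sound pressure scales as 10^(delta/20),
# while perceived loudness roughly doubles for every +10 dB (a common rule of
# thumb). The 43 dB(A) "quieter card" below is a hypothetical comparison
# point, not a measurement from this review.
def pressure_ratio(db_loud, db_quiet):
    """Ratio of sound pressure between two dB(A) levels."""
    return 10 ** ((db_loud - db_quiet) / 20)

def perceived_ratio(db_loud, db_quiet):
    """Approximate ratio of perceived loudness (doubles per +10 dB)."""
    return 2 ** ((db_loud - db_quiet) / 10)

loud, quiet = 50.0, 43.0  # quiet value is hypothetical
print(f"Sound pressure ratio: {pressure_ratio(loud, quiet):.2f}x")
print(f"Perceived loudness: ~{perceived_ratio(loud, quiet):.2f}x")
```

In other words, even a 7 dB(A) gap more than doubles the sound pressure and is clearly audible as a louder card.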

Zotac’s single-fan GTX 1650 card didn’t fare especially well here either, but by the time we reach FurMark, the GTX 1650 Super does even worse. It’s louder than any GTX 1660 card we’ve tested, even marginally exceeding the GTX 1660 Super.

All of this makes for an interesting competitive dichotomy given last week’s launch of the Radeon RX 5500 XT. The card we tested there, Sapphire’s Pulse RX 5500 XT, is almost absurd in how overbuilt it is for a 130W product, with a massive heatsink and equally massive fans. But it moves more heat than Zotac’s GTX 1650 Super with a fraction of the noise.

If nothing else, this is a perfect example of the trade-offs that Sapphire and Zotac made with their respective cards. Zotac opted to maximize compatibility, so that the GTX 1650 Super will fit in virtually any machine that the vanilla GTX 1650 can fit in, at the cost of having to use a relatively puny cooler. Sapphire went the other direction, making a card that’s hard to fit in some machines, but that barely has to work at all to keep itself cool. Ultimately neither approach is the consistently better one – despite its noise advantage, the Sapphire card’s superiority ends the moment it can’t fit in a system – underscoring the need for multiple partners (or at least multiple board designs). Still, it’s hard to imagine that Zotac couldn’t have done at least a bit better here; a 50 dB(A) card is not particularly desirable, especially for something as low-powered as a GTX 1650 series card.

67 Comments

  • Korguz - Sunday, December 22, 2019 - link

    why do you think the games will target ps4 ?? is this just your own opinion??
  • Kangal - Sunday, December 22, 2019 - link

    Because there's a lot of PS4 units hooked up to TVs right now, and they will still be hooked up until 2022. When the PS4 launched, the PS3 was slightly ahead of the Xbox 360, yet sales were nothing like the PS4's. And the PS3 was very outdated back in 2014, whereas in 2020, the PS4 is not nearly as outdated... so there's more longevity in there.

    So with all those factors and history, there's a high probability (certainty?) that Game Publishers will still target the PS4 as their baseline. This is good news for Gaming PC's with only 8GB RAM and 4GB VRAM, and performance below that of a RX 5700. Regardless, it's always easier to upgrade a PC's GPU than it is to upgrade the entire console.

    ...that's why Ryan is not quite right
  • Korguz - Sunday, December 22, 2019 - link

    um yea ok sure... and you have numbers to confirm this ?? seems plausible, but also, just personal opinion
  • Kangal - Monday, December 23, 2019 - link

    During the launch of the PS4 back in 2014, the older PS3 was 8 YEARS OLD at the time, and hadn't aged well, but it still did commendable sales of 85 Million consoles.

    I was surprised by the Xbox 360, which was 9.5 YEARS OLD and understandably more outdated, yet it did a surprising 75 Million consoles in sales.

    Because both consoles weren't very modern/quite outdated, and marketing was strong, the initial sales of the PS4 and Xbox One were very strong in 2014. Despite this there were about another 5 Million PS3 and Xbox 360 budget sales made in this period. And it took until Early-2016 for Game Publishers to ditch the PS3 and Xbox 360. So about 1.5 Years, and about 40 Million sales (PS4) or 25 Million sales (Xbox One) later. During this period people using 2GB VRAM Graphics Cards (GTX 960, AMD R9 370X) were in the clear. Only after 2016 were they really outdated, but it was a simple GPU Swap for most people.

    So that's what happened, that's our history.
    Now let's examine the current/upcoming events!
    The PS4 has sold a whopping 105 Million consoles, and the Xbox One has a commendable 50 Million units sold. These consoles should probably reach 110 Million and 55 Million respectively when the PS5 and Xbox X release. And within 2 years they will probably settle on a total of 120 Million and 60 Million sales total. That's a huge player base for companies to ignore, and is actually better than the previous generation. However, this current gen will have both consoles much less outdated than the previous gen, and it's understandable since both consoles will only be 6 YEARS OLD. So by the end of 2022, it should (will !!) be viable to use a lower-end card, something that "only" has 4GB VRAM such as the RX 5500XT or the GTX 1650-Super. And after that, it's a simple GPU Swap to fix that problem anyway so it's no big deal.

    Ryan thinks these 4GB VRAM cards will be obsolete within 6 Months. He's wrong about the timing. It should take 2 Years, or about x4 as much time. If he or you disagree, that's fine, but I'm going off past behavior and other factors. I will see Ryan in 6 Months and see if he was right or wrong.... if I remember to revisit this article/comment that is : )
  • Korguz - Monday, December 23, 2019 - link

    and yet... i know some friends that sold their playstations.. and got xboxes... go figure....
    for game makers to make a game for a console to port it to a comp = a crappy game for the most part.. supreme commander 2, is a prime example of this....
  • flyingpants265 - Sunday, December 22, 2019 - link

    Most benchmarks on this site are pretty bad and missing a lot of cards.

    Bench is OK but the recent charts are missing a lot of cards and a lot of tests.

    Pcpartpicker is working on a better version of bench, they've got dozens of PCs running benchmarks, 24/7 year-round, to test every possible combination of hardware and create a comprehensive benchmark list. Kind of an obvious solution, and I'm surprised nobody has bothered to do this for... 20-30 years or longer..
  • Korguz - Sunday, December 22, 2019 - link

    hmmmmmm could it be because of, oh, let me guess... cost ?????????????????
  • sheh - Saturday, December 21, 2019 - link

    In the buffer compression tests the 1650S fares worse than both the non-S cards and the 1050 Ti.
    How come?

    Curiously, the 1660S is even worse than the 1650S.
  • catavalon21 - Saturday, December 21, 2019 - link

    Guessing it's ratio differences not rated to absolute performance. A more comprehensive chart in BENCH of the INT8 Buffer Compression test shows the 2080Ti with a far lower score than any of the recent mid-range offerings.

    https://www.anandtech.com/bench/GPU19/2690
  • catavalon21 - Sunday, December 22, 2019 - link

    * not related to
