Power, Temperature, & Noise

Last, but not least of course, is our look at power, temperatures, and noise levels. While a high performing card is good in its own right, an excellent card can deliver great performance while also keeping power consumption and the resulting noise levels in check.

GeForce Video Card Voltages
           1650S Max   1660 Max   1650S Idle   1660 Idle
Voltage    1.05v       1.05v      0.65v        0.65v

If you’ve seen one TU116 card, then you’ve seen them all as far as voltages are concerned. Even with this cut-down part, NVIDIA still lets the GTX 1650 Super run at up to 1.05v, allowing it to boost as high as 1950MHz.

GeForce Video Card Average Clockspeeds
Game                          GTX 1660    GTX 1650 Super    GTX 1650
Max Boost Clock               1935MHz     1950MHz           1950MHz
Boost Clock                   1785MHz     1725MHz           1695MHz
Shadow of the Tomb Raider     1875MHz     1860MHz           1845MHz
F1 2019                       1875MHz     1875MHz           1860MHz
Assassin's Creed: Odyssey     1890MHz     1890MHz           1905MHz
Metro: Exodus                 1875MHz     1875MHz           1860MHz
Strange Brigade               1890MHz     1860MHz           1860MHz
Total War: Three Kingdoms     1875MHz     1890MHz           1875MHz
The Division 2                1860MHz     1830MHz           1800MHz
Grand Theft Auto V            1890MHz     1890MHz           1905MHz
Forza Horizon 4               1890MHz     1875MHz           1890MHz

Meanwhile the clockspeed situation looks relatively good for the GTX 1650 Super. Despite having a 20W lower TDP than the GTX 1660 and an official boost clock 60MHz lower, in practice our GTX 1650 Super card is typically within one step of the GTX 1660. This also keeps it fairly close to the original GTX 1650, which boosted higher in some cases and lower in others. In practice this means that the performance difference between the three cards is being driven almost entirely by the differences in CUDA core counts, as well as the use of GDDR6 in the GTX 1650 Super. Clockspeeds don’t seem to be a major factor here.
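
For readers who want to reproduce this sort of clockspeed logging on their own cards, the driver-reported clocks are exposed through NVIDIA's NVML interface. The sketch below uses the third-party pynvml bindings to sample the graphics clock once per second over a benchmark pass and report the average; this is purely an illustrative assumption on my part, not the tooling used to generate the table above.

# Minimal clockspeed logger via NVML (assumes the pynvml bindings and an NVIDIA
# driver are installed). Illustrative sketch only, not the review's actual tooling.
import time
import pynvml

SAMPLE_SECONDS = 300  # roughly the length of one benchmark pass

pynvml.nvmlInit()
try:
    gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
    samples = []
    for _ in range(SAMPLE_SECONDS):
        # Current graphics (core) clock in MHz, as reported by the driver
        samples.append(pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS))
        time.sleep(1)
    print(f"Average graphics clock: {sum(samples) / len(samples):.0f} MHz")
    print(f"Peak graphics clock:    {max(samples)} MHz")
finally:
    pynvml.nvmlShutdown()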

Idle Power Consumption

Load Power Consumption - Shadow of the Tomb Raider

Load Power Consumption - FurMark

Shifting to power consumption, we see the cost of the GTX 1650 Super’s greater performance. It’s well ahead of the GTX 1650, but it’s pulling more power in the process. In fact, I am a bit surprised by just how close it is (at the wall) to the GTX 1660, especially under Tomb Raider. While on paper it has a 20W lower TDP, in practice it actually fares a bit worse than the next TU116 card up the stack. It’s only under FurMark, a pathological use case, that we see the GTX 1650 Super slot in under the GTX 1660. The net result is that the GTX 1650 Super seems to be somewhat inefficient, at least by NVIDIA’s standards: it doesn’t save a whole lot of power versus the GTX 1660 series, despite its lower performance.

This also means it doesn’t fare especially well against the Radeon RX 5500 XT series. As with the GTX 1660, the RX 5500 XT draws less power than the GTX 1650 Super under Tomb Raider. It’s only by maxing out all of the cards with FurMark that the GTX 1650 Super pulls ahead. In practice I expect real-world conditions to fall between these two values – Tomb Raider may be a bit too hard on these low-end cards – but regardless, this would put the GTX 1650 Super only marginally ahead of the RX 5500 XT.

Idle GPU Temperature

Load GPU Temperature - Shadow of the Tomb Raider

Load GPU Temperature - FurMark

Looking at GPU temperatures, Zotac’s GTX 1650 Super card puts up decent numbers. While the 100W card understandably gets warmer than its 75W GTX 1650 sibling, we never see GPU temperatures cross 70C. The card is staying plenty cool.

Idle Noise Levels

Load Noise Levels - Shadow of the Tomb Raider

Load Noise Levels - FurMark

But when it comes time to measure how much noise the card is producing, we find a different picture. In order to keep the GPU at 69C, the Zotac GTX 1650 Super’s fans have to do some real work. And unfortunately, the small 65mm fans just aren’t very quiet once they have to spin up. To be sure, the fans aren’t anywhere near their maximum speed – we recorded just 55%, as reported by NVIDIA’s drivers – however this is still enough to push noise levels over 50 dB(A).
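
Incidentally, the fan speed figure quoted above is the duty cycle the driver itself reports; a quick way to spot-check it on your own card, again assuming the pynvml bindings rather than any particular vendor tool, looks something like the snippet below. The noise level itself, of course, still has to be measured with a sound meter.

# Quick fan/temperature/power spot check via NVML (pynvml assumed installed).
# Illustrative only; dB(A) figures come from a sound meter, not the driver.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
fan_pct = pynvml.nvmlDeviceGetFanSpeed(gpu)  # fan duty cycle, in percent
temp_c = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
power_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0  # reported board power, watts
print(f"Fan: {fan_pct}% | GPU: {temp_c}C | Board power: {power_w:.1f} W")
pynvml.nvmlShutdown()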

Zotac’s single-fan GTX 1650 card didn’t fare especially well here either, but by the time we reach FurMark, the GTX 1650 Super does even worse. It’s louder than any GTX 1660 card we’ve tested, even marginally exceeding the GTX 1660 Super.

All of this makes for an interesting competitive dichotomy given last week’s launch of the Radeon RX 5500 XT. The card we tested there, Sapphire’s Pulse RX 5500 XT, is almost absurd in how overbuilt it is for a 130W product, with a massive heatsink and equally massive fans. But it moves more heat than Zotac’s GTX 1650 Super with a fraction of the noise.

If nothing else, this is a perfect example of the trade-offs that Sapphire and Zotac made with their respective cards. Zotac opted to maximize compatibility, so that the GTX 1650 Super will fit in virtually any machine that the vanilla GTX 1650 can fit in, at the cost of having to use a relatively puny cooler. Sapphire went the other direction, making a card that’s hard to fit in some machines, but that barely has to work at all to keep itself cool. Ultimately neither approach is the consistently better one – despite its noise advantage, the Sapphire card’s superiority ends the moment it can’t fit in a system – underscoring the need for multiple partners (or at least multiple board designs). Still, it’s hard to imagine that Zotac couldn’t have done at least a bit better here; a 50 dB(A) card is not particularly desirable, especially for something as low-powered as a GTX 1650 series card.

67 Comments

  • WetKneeHouston - Monday, January 20, 2020 - link

    I got a 1650 Super over the 580 because it's more power efficient, and anecdotally I've experienced better stability with Nvidia's driver ecosystem.
  • yeeeeman - Friday, December 20, 2019 - link

    It is as if AMD didn't have a 7nm GPU, but a 14nm one.
  • philosofool - Friday, December 20, 2019 - link

    Can we not promote the idea, invented by card manufacturers, that everyone who isn't targeting 60fps and high settings is making a mistake? Please publish some higher resolution numbers for those of us who want that knowledge. Especially at the sub-$200 price point, many people are primarily using their computers for things other than games and gaming is a secondary consideration. Please let us decide which tradeoffs to make instead of making assumptions.
  • Dragonstongue - Friday, December 20, 2019 - link

    100% agreed on this.

    It's up to the consumers themselves how, where, and why they use the device, as they see fit: be it gaming, streaming, "mundane" stuff such as watching videos, emulation, or sometimes even "creation" purposes.

    IMO it's very much the same BS crud smartphone makers use(d) to ditch 3.5mm jacks: "customers do not want them anymore, and with limited space we had no choice."

    So instead of adjusting the design to keep the 3.5mm jack AND a large enough battery, they remove the jack and limit the battery size. ~95% are fully sealed so you cannot replace the battery, and nearly all of them these days are "glass" that is pretty by design but also stupidly easy to break, so you have no choice but to make a very costly repair and/or buy a new one.

    With GPUs they CAN make sure there is a DL-DVI connector, HDMI, and a full size DP port (with maybe 1 mini DP), but they seem to "not bother", citing silly reasons like "it is impossible / customers no longer want this".

    As you point out, the consumer decides the use case. Provide the best possible product, give the best possible NO BS review/test data, and we consumers will decide with the WALLET whether it is worth it or not.

    Likely saving much $$$$$$$$$$ and consumer <3 by virtue of not buying something they will regret in the first place.

    Hell, I am using and gaming on a Radeon 7870 with a 144Hz 1440p monitor (it only runs at 60Hz, as the card doesn't fully support higher than that). I still manage to game on it "just fine": maybe not ultra spec everything, but comfortably (for me) at high to medium "tweaked" settings.

    Amazing how long these last when they are built properly and don't have the crap kicked out of them... That, and not having hundreds to thousands to spend every year or so (which is most people these days), should mean so much more to these mega corps than "let us sell something that most folks really do not need." Make it right, and upgrades will happen when people really need them, instead of cards just ending up in the e-waste bin in a few months' time.
  • timecop1818 - Friday, December 20, 2019 - link

    DVI? No modern card should have that garbage connector. Just let it die already.
  • Korguz - Friday, December 20, 2019 - link

    yea ok sure... so you still want the vga connector instead ???
  • Qasar - Friday, December 20, 2019 - link

    DVI is a lot more useful than the VGA connector that monitors STILL come with, and yet we STILL have those on new monitors. No modern monitor should have that garbage connector.
  • The_Assimilator - Saturday, December 21, 2019 - link

    No VGA. No DVI. DisplayPort and HDMI, or GTFO.
  • Korguz - Sunday, December 22, 2019 - link

    VGA: dead connector, limited use case, mostly business... DVI: still useful, especially in KVMs. Haven't seen a DisplayPort KVM, and the HDMI KVM I had died a few months after I got it, but the DVI KVMs I have still work fine. Each of the 3 (DVI, HDMI and DisplayPort) still has its uses.
  • Spunjji - Monday, December 23, 2019 - link

    DisplayPort KVMs exist. More importantly, while it's trivial to convert a DisplayPort output to DVI for a KVM, you simply cannot fit the required bandwidth for a modern high-res DP monitor through a DVI port.

    DVI ports are large, low-bandwidth and have no place on a modern GPU.
