Final Words

For our final video card review of the year, the GeForce GTX 1650 Super admittedly doesn’t bring any surprises. But then it didn’t need to. As the 4th TU116 card released this year alone, and the 12th Turing GeForce card overall, the GTX 1650 Super arrives with few unknowns; we generally have a good grip on what the current generation of GeForce cards can do. All that really needed to change was for NVIDIA to get a better grip on what the market is looking for, performance-wise, in a sub-$200 card, and with the GTX 1650 Super the company is finally doing that.

Looking at things on a pure performance basis, the GTX 1650 Super makes significant strides over the GTX 1650, never mind NVIDIA’s last-generation cards. As it should: it offers a very hefty increase in the number of CUDA cores as well as memory bandwidth, and even with the TU116 GPU in its cut-down state for this card, it’s still just a whole lot more GPU than the TU117 in the GTX 1650. NVIDIA has thrown significantly more hardware at this segment, and the net result is a big performance boost.

And just in time as well. The launch of AMD’s rival Radeon RX 5500 XT immediately took the original GTX 1650 down a peg, as AMD’s card offered a lot more performance for only a small increase in price. So one way or another, the GTX 1650 Super is NVIDIA’s counter-play to AMD’s new card.

Performance Summary (1080p)
                               Relative       Relative    Relative
                               Performance    Price       Perf-Per-Dollar
GTX 1650S vs GTX 1650          +33%           +7%         +25%
GTX 1650S vs GTX 1660          -11%           -24%        +16%
GTX 1650S vs GTX 1050 Ti       +77%           +14%        +55%
GTX 1650S vs RX 5500 XT 4GB    0%             -6%         +6%
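
As a quick sanity check on that last column, perf-per-dollar here is simply relative performance divided by relative price. Below is a minimal sketch of the arithmetic using the rounded deltas from the table, so the results land within a point of the published figures:

```python
# Perf-per-dollar as (1 + perf_delta) / (1 + price_delta) - 1.
# Inputs are the rounded deltas from the summary table above.
comparisons = {
    "GTX 1650S vs GTX 1650":       (+0.33, +0.07),
    "GTX 1650S vs GTX 1660":       (-0.11, -0.24),
    "GTX 1650S vs GTX 1050 Ti":    (+0.77, +0.14),
    "GTX 1650S vs RX 5500 XT 4GB": ( 0.00, -0.06),
}

for name, (perf, price) in comparisons.items():
    perf_per_dollar = (1 + perf) / (1 + price) - 1
    print(f"{name}: {perf_per_dollar:+.0%} perf-per-dollar")
```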

The outcome is surprisingly even-handed between AMD and NVIDIA. With regard to framerates, the $159 GTX 1650 Super and the $169 RX 5500 XT 4GB are in a dead heat. As is usually the case, the cards are anything but equal on a game-by-game basis, constantly trading wins and losses, but at the end of the day they’re fighting over the same market with the same performance. NVIDIA has a slight edge in price, and perhaps an even slighter edge in overall energy efficiency, but that’s it. As a result, the GTX 1650 Super is significantly better than the GTX 1650, but it’s not able to meaningfully pull ahead of AMD’s card.

Meanwhile, although the original GTX 1650 is now well outclassed in terms of performance, the card isn’t going anywhere, and for good reason: it remains NVIDIA’s best option for the 75W market. The GTX 1650 Super improves on performance by roughly 33%, but it needs proportionally more power to get there, taking it past what a PCIe slot alone can supply and requiring a 6-pin connector. For system builds that aren’t sensitive to power or thermal constraints, this isn’t going to matter, and the GTX 1650 Super is well worth the $10 price premium. Otherwise, the original GTX 1650 is still the best card available for the 75W, maximum-compatibility, PCIe-slot-power-only market.
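
To put numbers on that distinction, here is a minimal sketch of the PCIe power-budget math, assuming NVIDIA’s official board power figures (75W and 100W) and the standard slot/connector limits; actual draw under load will vary by card:

```python
# Rough sketch of the PCIe power-budget reasoning behind the 75W split.
# A PCIe x16 slot supplies up to 75W on its own; a 6-pin connector adds
# up to 75W and an 8-pin up to 150W. Board powers are NVIDIA's official
# figures, not measured consumption.

SLOT_LIMIT_W = 75

def needs_external_power(board_power_w: int) -> bool:
    """True if the card's board power exceeds what the slot alone can supply."""
    return board_power_w > SLOT_LIMIT_W

for card, board_power in {"GTX 1650": 75, "GTX 1650 Super": 100}.items():
    if needs_external_power(board_power):
        print(f"{card} ({board_power}W): needs an external connector, e.g. a 6-pin")
    else:
        print(f"{card} ({board_power}W): can run off slot power alone")
```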

Though, as with last week’s Radeon review, I’ll also note here my general hesitation about cards with 4GB of VRAM. VRAM isn’t cheap, and GDDR6 even less so, so both vendors are using VRAM capacity as a product differentiator and as a way to upsell their better cards. But as VRAM capacity in the $150-$200 price range has been pretty stagnant for the last couple of years now, I do have some concerns about the long-term implications for 4GB cards, especially with the next-generation consoles set to launch in a year’s time. With the consoles setting the baseline for most multiplatform games, it’s a reasonable bet that VRAM requirements aren’t going to stay put at 4GB for much longer.

Unlike with AMD, the situation isn’t quite as black and white in the NVIDIA ecosystem, as NVIDIA doesn’t offer an 8GB GTX 1650 Super – or even an 8GB GTX 1660 series card, for that matter. So the answer can’t simply be “buy the 8GB card,” as it is with the RX 5500 XT. Still, along with offering better performance, the GTX 1660 cards and their 6GB of VRAM stand a better chance of delivering solid, unimpeded gaming performance in a year or two’s time. At the end of the day, I don’t think any 4GB card is a great choice right now; if you can afford to go higher, you should.

Finally, we have the particulars of Zotac’s card, the Zotac Gaming GeForce GTX 1650 Super. Zotac is one of NVIDIA’s most regular and reliable partners, and it shows, with the company producing a card that might as well be the official GTX 1650 Super reference card. It is as good an example of the baseline GTX 1650 Super experience as one could hope for, right on down to the fact that it’ll fit into virtually any PC.

But as with its predecessor, the original Zotac GTX 1650, I remain unimpressed with the coolers on Zotac’s GTX 1650 series cards. We have seen and tested other, similar double-slot cards that are quieter, including cards from Zotac itself. So for the Zotac GTX 1650 Super to land among the loudest of our low-end cards is unfortunate. While I admire Zotac’s commitment to making such a small card, I can’t help but think that, if nothing else, a single, larger fan would have fared better. Low-end cards are always a design challenge in terms of profit margins, but I think there’s room here for Zotac to do better, even for a baseline card like their Gaming GTX 1650 Super.

 
Comments

  • WetKneeHouston - Monday, January 20, 2020 - link

    I got a 1650 Super over the 580 because it's more power efficient, and anecdotally I've experienced better stability with Nvidia's driver ecosystem.
  • yeeeeman - Friday, December 20, 2019 - link

    It is as if AMD didn't have a 7nm GPU, but a 14nm one.
  • philosofool - Friday, December 20, 2019 - link

    Can we not promote the idea, invented by card manufacturers, that everyone who isn't targeting 60fps and high settings is making a mistake? Please publish some higher resolution numbers for those of us who want that knowledge. Especially at the sub-$200 price point, many people are primarily using their computers for things other than games and gaming is a secondary consideration. Please let us decide which tradeoffs to make instead of making assumptions.
  • Dragonstongue - Friday, December 20, 2019 - link

    100% agreed on this.

    It's up to consumers themselves how, where, and why they use the device, be it gaming, streaming, "mundane" things like watching videos, emulation, or even "creation" purposes.

    IMO it's very much the same BS crud smartphone makers use(d) to ditch 3.5mm jacks: "customers do not want them anymore, and with limited space we had no choice."

    So instead of adjusting the design to keep the 3.5mm jack AND a large enough battery, they remove the jack and limit the battery size. ~95% are fully sealed so you cannot replace the battery, and nearly all of them these days are "glass" that is by design pretty but also stupid easy to break, so you have no choice but to make a very costly repair and/or buy a new one.

    With GPUs they CAN make sure there is a DL-DVI connector, HDMI, and a full-size DP port (with maybe 1 mini DP),

    but they seem to "not bother", citing silly reasons: "it is impossible / customers no longer want this".

    As you point out, the consumer decides the usage case. Provide the best possible product, give the best possible NO BS review/test data, and we consumers will decide with the WALLET whether it is worth it or not.

    That would likely save much $$$$$$$$$$ and earn consumer <3, by virtue of people not buying something they will regret in the first place.

    Hell, I am using and gaming with a Radeon 7870 on a 144Hz 1440p monitor (it only runs at 60Hz due to the card not fully supporting higher than this). I still manage to game on it "just fine": maybe not ultra spec everything, but comfortably (for me) at high to medium "tweaked" settings.

    Amazing how long these last when they are built properly and don't have the crap kicked out of them... that, and not having hundreds to thousands to spend every year or so (which is most people these days), should mean so much more to these mega corps than "let us sell something that most folks really do not need". Make it right, and upgrades will happen when people really need them, instead of cards ending up in the e-waste can in a few months' time.
  • timecop1818 - Friday, December 20, 2019 - link

    DVI? No modern card should have that garbage connector. Just let it die already.
  • Korguz - Friday, December 20, 2019 - link

    yea ok sure... so you still want the vga connector instead ???
  • Qasar - Friday, December 20, 2019 - link

    DVI is a lot more useful than the VGA connector that monitors STILL come with, yet we STILL get VGA on new monitors. No modern monitor should have that garbage connector.
  • The_Assimilator - Saturday, December 21, 2019 - link

    No VGA. No DVI. DisplayPort and HDMI, or GTFO.
  • Korguz - Sunday, December 22, 2019 - link

    VGA.. dead connector, limited use case, mostly business... DVI.. still useful, especially in KVMs... haven't seen a DisplayPort KVM.. and the HDMI KVM died a few months after I got it.. but the DVI KVMs I have still work fine. Each of the 3 (DVI, HDMI, and DisplayPort) still has its uses..
  • Spunjji - Monday, December 23, 2019 - link

    DisplayPort KVMs exist. More importantly, while it's trivial to convert a DisplayPort output to DVI for a KVM, you simply cannot fit the required bandwidth for a modern high-res DP monitor through a DVI port.

    DVI ports are large, low-bandwidth and have no place on a modern GPU.
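
For a rough sense of the bandwidth gap being described above, here is a back-of-the-envelope sketch that ignores blanking intervals and assumes the commonly cited dual-link DVI limit (~330 MHz pixel clock) and DisplayPort 1.2 data rate (~17.28 Gbps); real requirements run somewhat higher than these figures:

```python
# Approximate pixel-rate math for the DVI vs. DisplayPort comparison.
# Blanking intervals are ignored, so actual requirements are a bit higher.

DL_DVI_MAX_PIXEL_CLOCK_MHZ = 330      # commonly cited dual-link DVI ceiling (2 x 165 MHz)
DP12_DATA_RATE_GBPS = 17.28           # DisplayPort 1.2 HBR2, 4 lanes, after 8b/10b

def pixel_clock_mhz(width: int, height: int, refresh_hz: int) -> float:
    """Approximate pixel clock needed for a mode, ignoring blanking."""
    return width * height * refresh_hz / 1e6

for w, h, hz in [(1920, 1080, 144), (2560, 1440, 144), (3840, 2160, 60)]:
    clk = pixel_clock_mhz(w, h, hz)
    gbps = clk * 24 / 1000            # 24 bits per pixel
    dvi_ok = clk <= DL_DVI_MAX_PIXEL_CLOCK_MHZ
    dp_ok = gbps <= DP12_DATA_RATE_GBPS
    print(f"{w}x{h}@{hz}Hz: ~{clk:.0f} MHz, ~{gbps:.1f} Gbps "
          f"-> DL-DVI {'ok' if dvi_ok else 'no'}, DP 1.2 {'ok' if dp_ok else 'no'}")
```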
