Meet The ZOTAC GAMING GeForce GTX 1650 OC

In what's becoming a theme of the GTX 16-series, the GeForce GTX 1650 is once again a pure virtual launch: NVIDIA is not offering any Founders Edition models, leaving all cards to its add-in board partners. For today's review, we take a look at ZOTAC's GeForce GTX 1650 OC, a diminutive 2-slot, single-fan card with the reference base clockspeed and a mildly overclocked boost clock. With a TDP of 75W, the card pulls all of its power from the slot, which is typical for GeForce GTX xx50 parts.

GeForce GTX 1650 Card Comparison
              GTX 1650 (Reference Specification)   ZOTAC GTX 1650 GAMING OC
Base Clock    1485MHz                              1485MHz
Boost Clock   1665MHz                              1695MHz
Memory Clock  8Gbps GDDR5                          8Gbps GDDR5
VRAM          4GB                                  4GB
TDP           75W                                  75W
Length        N/A                                  5.94"
Width         N/A                                  2-Slot
Cooler Type   N/A                                  Open Air
Price         $149                                 $149

At just under 6", the ZOTAC GTX 1650 OC is compact enough for most builds. As the card pulls power only from the PCIe slot, it's a natural fit for mITX and other SFF builds, or simply a no-fuss drop-in replacement. In turn, the ZOTAC GTX 1650 OC's cooling solution is one the company has used before on its other mini-ITX cards, combining a 90mm fan with a 'sunflower' heatsink. This also provides enough headroom for ZOTAC to apply a modest 30MHz boost clock increase.


The design/shroud and output situation is likewise similar. One DVI port, one HDMI 2.0b port, and one DisplayPort cover all the bases, including potential HTPC use. Of course, partners can always decide on different configurations, but the power- and cost-sensitive entry-level range is essentially standardized. VirtualLink is naturally not included here for several reasons; for perspective, the 30W USB-C controller power budget for VirtualLink alone would be 40% of the card's overall 75W TDP.

For overclocking and tweaking, ZOTAC has their in-house Firestorm utility updated for Turing, including support for auto-OC scanning as part of Turing's GPU Boost 4 technology.

126 Comments

  • PeachNCream - Tuesday, May 7, 2019 - link

    Agreed with nevc on this one. When you start discussing higher-end, higher-cost components, consideration for power consumption largely comes off the proverbial table, because priority is naturally assigned more so to performance than to purchase price, electrical consumption, or TCO.
  • eek2121 - Friday, May 3, 2019 - link

    Disclaimer, not done reading the article yet, but I saw your comment.

    Some people look for low wattage cards that don't require a power connector. These types of cards are particularly suited for MiniITX systems that may sit under the TV. The 750ti was super popular because of this. Having Turing's HEVC video encode/decode is really handy. You can put together a nice small MiniITX build with something like the Node 202 and it will handle media duties much better than other solutions.
  • CptnPenguin - Friday, May 3, 2019 - link

    That would be great if it actually had the Turing HEVC encoder - it does not; it retains the Volta encoder for cost saving or some other Nvidia-Alone-Knows reason. (source: Hardware Unboxed and Gamer's Nexus).

    Anyone buying a 1650 and expecting to get the Turing video encoding hardware is in for a nasty surprise.
  • Oxford Guy - Saturday, May 4, 2019 - link

    "That would be great if it actually had the Turing HEVC encoder - it does not; it retains the Volta encoder"

    Yeah, lack of B-frame support stinks.
  • JoeyJoJo123 - Friday, May 3, 2019 - link

    Or if you're going with a miniITX low wattage system, you can cut out the 75w GPU and just go with a 65w AMD Ryzen 2400G since the integrated Vega GPU is perfectly suitable for an HTPC type system. It'll save you way more money with that logic.
  • 0ldman79 - Sunday, May 19, 2019 - link

    What they are going to do, though, is compare the faster GPU + PSU against the slower GPU alone.

    People with OEM boxes are going to buy one part at a time. Trust me on this, it's frustrating, but it's consistent.
  • Gich - Friday, May 3, 2019 - link

    $25 a year? So 7 cents a day?
    7 cents is more than 1kWh costs where I live.
  • Yojimbo - Friday, May 3, 2019 - link

    The US average is a bit over 13 cents per kilowatt-hour. But I made an error in the calculation and was way off. It's more like $15 over 2 years and not $50. Sorry.
  • DanNeely - Friday, May 3, 2019 - link

    That's for an average of 2h/day gaming. Bump it up to a hard core 6h/day and you get around $50/2 years. Or 2h/day but somewhere with obnoxiously expensive electricity like Hawaii or Germany.
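The back-of-the-envelope math in this thread is easy to sketch. The snippet below is a rough illustration only: the ~80W draw difference is an assumption chosen to match the thread's dollar figures (it is not a measured number from the review), and the 13 cents/kWh rate is the US average quoted above.

```python
def gaming_cost(extra_watts, hours_per_day, cents_per_kwh, years=2):
    """Extra electricity cost in USD over `years` of daily gaming,
    for a card that draws `extra_watts` more than the alternative."""
    kwh = extra_watts / 1000 * hours_per_day * 365 * years
    return kwh * cents_per_kwh / 100

# Assumed ~80W delta (hypothetical), 13 cents/kWh US average:
print(round(gaming_cost(80, 2, 13)))  # 2h/day -> ~$15 over two years
print(round(gaming_cost(80, 6, 13)))  # 6h/day -> ~$46 over two years
```

At triple the gaming hours the cost simply triples, which is how a "hard core" 6h/day schedule (or a ~3x higher electricity rate) turns ~$15 into roughly $50 over two years.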
  • rhysiam - Saturday, May 4, 2019 - link

    I'd just like to point out that if you've gamed for an average of 6h per day over 2 years with a 570 instead of a 1650, then you've also been enjoying 10% or so extra performance. That's more than 4,000 hours of higher detail settings and/or frame rates. If people are trying to calculate the true "value" of a card, then I would argue that this extra performance over time should be part of the equation - let's not forget the performance benefits!
