Closing Thoughts

Wrapping up our look at what would seem to be the last GeForce desktop video card launch for the time being, the GeForce GTX 1650 caps off the Turing launch cycle in an interesting manner. On the one hand, NVIDIA’s pure technological dominance has never been quite so prominent as it is with the GTX 1650. On the other hand, the outright value of their lead over rival AMD has never been quite so muddled. As I noted earlier in this review, the GTX 1650 is a card that serves many masters, and it serves some better than others.

Overall, NVIDIA is treading in very familiar territory here, thanks to their well-defined product stack. The GTX 1650 is a low-end video card. It’s a low-priced video card. It’s a low-power video card. It’s a video card that no doubt will be filling OEM gaming systems left and right as the stock video card – and it’ll be a card that fills countless older OEM systems that need a pick-me-up that runs under 75W. It fills all of these roles well – as a GTX xx50 card should – but it’s also a card that definitely faces stiffer competition than the other members of the Turing GeForce family.

From a tech perspective then, GTX 1650 and its underlying TU117 GPU are another example of consistent execution on Turing GPU development by NVIDIA. By this point the Turing architecture is a known quantity – faster, more efficient, and packing a number of new graphics features – so for our regular readers who have been with us since the RTX 2080 launch, the GTX 1650 doesn’t get the luxury of delivering any big surprises here. But then “big” is the very opposite of what NVIDIA aimed to do with the GTX 1650; instead this launch is all about bringing the Turing architecture and its benefits down to their smallest GPU and video cards.

By the numbers then, the GeForce GTX 1650’s story is very similar to this year’s other GeForce launches. The $149 card is about 30% faster than its most comparable predecessor, the GTX 1050 Ti, which is just a bit under the generational performance gains we’ve seen from the other Turing cards. Like those other cards the performance gains aren’t nearly large enough to justify replacing an existing GeForce 10 series Pascal card, but it’s a far more enticing upgrade for replacing the GTX 750, GTX 950, and similar cards that are now a couple of generations old. Against the GTX 950 the new GTX 1650 is 63% faster, and compared to the GTX 750 Ti that’s a 111% performance lead.

GeForce: Turing versus Pascal

Comparison                     List Price (Turing)   Relative Performance   Relative Price   Relative Perf-Per-Dollar
RTX 2080 Ti vs GTX 1080 Ti     $999                  +32%                   +42%             -7%
RTX 2080 vs GTX 1080           $699                  +35%                   +40%             -4%
RTX 2070 vs GTX 1070           $499                  +35%                   +32%             +2%
RTX 2060 vs GTX 1060 6GB       $349                  +59%                   +40%             +14%
GTX 1660 Ti vs GTX 1060 6GB    $279                  +36%                   +12%             +21%
GTX 1660 vs GTX 1060 3GB       $219                  +28%                   +10%             +16%
GTX 1650 vs GTX 1050 Ti        $149                  +30%                   +7%              +21%
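For readers who want to sanity-check the table, the perf-per-dollar column follows directly from the other two: relative perf-per-dollar is (1 + relative performance) / (1 + relative price) − 1. A quick sketch, using the values from the table:

```python
# Relative perf-per-dollar follows from the performance and price deltas:
# (1 + perf) / (1 + price) - 1, rounded to whole percentages as in the table.

comparisons = {
    "RTX 2080 Ti vs GTX 1080 Ti":  (0.32, 0.42),
    "RTX 2080 vs GTX 1080":        (0.35, 0.40),
    "RTX 2070 vs GTX 1070":        (0.35, 0.32),
    "RTX 2060 vs GTX 1060 6GB":    (0.59, 0.40),
    "GTX 1660 Ti vs GTX 1060 6GB": (0.36, 0.12),
    "GTX 1660 vs GTX 1060 3GB":    (0.28, 0.10),
    "GTX 1650 vs GTX 1050 Ti":     (0.30, 0.07),
}

for name, (perf, price) in comparisons.items():
    perf_per_dollar = (1 + perf) / (1 + price) - 1
    print(f"{name}: {perf_per_dollar:+.0%}")
```

Running this reproduces the final column, including the GTX 1650's +21%: its modest 7% price bump against the GTX 1050 Ti is what turns a below-average 30% performance gain into a class-leading value improvement.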

These performance gains also mean that the GTX 1650 is assuming the mantle as the best sub-75W video card on the market. The 30% performance gain over the previous holder, the GTX 1050 Ti, comes with only the slightest increase in power consumption – and easily staying under the 75W cap – making it the card to get for TDP-constrained systems. And HTPC users will find that it can decode every format thrown at it, from VP9 Profile 2 to all sorts of permutations of HEVC, making it a great candidate for driving the latest generation of HDR TVs. Just don't make plans to do HEVC encoding with the card if you care about bitrate efficiency.

All told, NVIDIA has ruled the roost for a while now in the 75W space, and the GTX 1650 only further widens that lead. NVIDIA continues to hold an edge on features, all the while enjoying a staggering 70% performance advantage over AMD’s most comparable 75W Radeon RX 560 cards.

But for everything going for it from a technology perspective, the GTX 1650 does face one big hurdle: AMD’s tenacity and willingness to sell GPUs with a thinner profit margin. While the GTX 1650 handily disposes of the Radeon RX 560, the Radeon RX 570 is another matter. An outright mid-range card that has seen its price knocked down to under $150, the RX 570 brings more of everything to the table. More performance, more memory, more memory bandwidth, more bundled games, and more power consumption. AMD can’t match NVIDIA on features or architectural efficiency at this point, but they can sure undercut NVIDIA’s pricing, and that’s exactly what the company has opted to do.

The end result is that while the GTX 1650 is easily the best sub-75W card on the market, it’s a whole different game once power consumption (and power efficiency) go out the window. On a pure performance basis AMD’s petite Polaris card offers around 11% better performance than the GTX 1650, and if you’re buying a video card in this price range, then that’s more than enough of a difference to take notice. The GTX 1650 may be technologically superior here, but if all you’re after is the best performance for the price then the decision is an easy one to make, and AMD is happy to win on sheer brute force.
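For anyone weighing that trade-off in dollar terms, the electricity math is straightforward to sketch. The figures below are illustrative assumptions, not measurements from this review: roughly a 75W gap in gaming power draw between a typical RX 570 and the GTX 1650, a $20 price gap between the cards, and an average US residential electricity rate of about $0.13/kWh.

```python
# Rough break-even sketch: how long until the GTX 1650's lower power draw
# pays back a higher purchase price? All inputs are illustrative assumptions.

power_gap_w   = 75      # assumed extra board power of an RX 570 under load (W)
price_gap_usd = 20.0    # assumed price premium of the GTX 1650 (USD)
rate_per_kwh  = 0.13    # assumed average US residential electricity rate (USD/kWh)

# Extra electricity cost per hour of gaming on the higher-power card
extra_cost_per_hour = (power_gap_w / 1000) * rate_per_kwh

break_even_hours = price_gap_usd / extra_cost_per_hour
years_at_2h_per_day = break_even_hours / (2 * 365)

print(f"Break-even after {break_even_hours:,.0f} hours of gaming")
print(f"At 2 hours/day, that's roughly {years_at_2h_per_day:.1f} years")
```

Under these assumptions the break-even point lands at around two thousand gaming hours, or a few years of regular play, which is why the power argument only carries much weight for buyers who are hard-capped at 75W rather than merely cost-conscious.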

It’s an unusual way to cap off the GeForce Turing launch, to say the least: NVIDIA, as it turns out, is both at its most and least competitive at the very bottom of its product stack. But with that said, NVIDIA could always cut the price of the GTX 1650 to be more in line with its performance – essentially spoiling AMD’s RX 570 on price. However, given the GTX 1650’s other strengths – not to mention NVIDIA’s significant OEM and branding advantages – I seriously doubt that NVIDIA has much incentive to do that. Instead it looks like NVIDIA is content to let AMD swing the RX 570 around, at least for now. However it bears noting that the GTX 1650 is not a fully-enabled TU117 card, and while I don’t expect a theoretical GTX 1650 Ti any time soon, at some point NVIDIA is going to roll that out and rebalance the market once more.

Last but not least, let’s shift gears and talk about the specific GTX 1650 we reviewed today, Zotac’s GAMING GeForce GTX 1650 OC. Zotac’s sole GTX 1650 card, the GAMING OC is competing in a market full of GTX 1650 cards from other board partners, and yet amongst all of those cards it’s arguably one of the purest GTX 1650 cards on the market. Despite NVIDIA’s intentions for the GTX 1650, most of their board partners went and built factory overclocked cards that blow right past the GTX 1650’s reference TDP of 75W, negating several of the GTX 1650’s advantages. Zotac’s GAMING GeForce GTX 1650 OC, by comparison, is a true 75W card, putting it in rare company as one of the only GTX 1650 cards that can actually be installed in virtually any system and powered entirely by the PCIe slot.

The end result is that, even with the extremely mild factory overclock, Zotac’s GAMING OC card is a solid baseline GTX 1650 card. The compact card doesn’t require an external PCIe power plug, and as a result can be dropped in almost any system. And at 5.54” long, it’s also among the shortest GTX 1650 cards on the market, letting it easily squeeze into smaller systems, including Mini-ITX machines.

In fact the only real drawback I can come up with for the card is its noise; while by no means loud, we have seen and tested other similar double-slot/single-fan cards before that are quieter. So while it’s still a solid choice for small systems, it’s not going to be an entirely silent card in the process. But if nothing else, this leaves Zotac with some room for a fully passive GTX 1650, which, I suspect, is something that would be particularly well-received in the HTPC market.

126 Comments

  • philehidiot - Friday, May 3, 2019 - link

    Over here, it's quite routine for people to consider the efficiency cost of using AC in a car and whether it's more sensible to open the window... If you had a choice between a GTX 1080 and a Vega 64, which perform nearly the same, and assume they cost nearly the same, then you'd take into account that one requires a small nuclear reactor to run whilst the other is probably more energy sipping than your current card. Also, some of us are on this thing called a budget. A $50 saving is a week's food shopping.
  • JoeyJoJo123 - Friday, May 3, 2019 - link

    Except your comment is exactly in line with what I said:
    "Lower power for the same performance at a similar enough price can be a tie-breaker between two competing options, but that's not the case here for the 1650"

    I'm not saying power use of the GPU is irrelevant, I'm saying performance/price is ultimately more important. The RX 570 GPU is not only significantly cheaper, but it outperforms the GTX 1650 in most scenarios. Yes, the RX 570 does so by consuming more power, but it'd take 2 or so years of power bills (at least according to the avg American power bill per month) to break even with the GTX 1650, and even at that mark where the cost of ownership is equivalent, the RX 570 has still provided 2 years of consistently better performance, and will continue to offer better performance.

    Absolutely, a GTX 1080 is a smarter buy compared to the Vega 64 given the power consumption, but that's because power consumption was the tie-breaker. The comparison wouldn't be as ideal for the GTX 1080 if it cost 30% more than the Vega 64, offered similar performance, but came with the long-term promise of ~eventually~ paying for the upfront difference in cost with a reduction in power cost.

    Again, the sheer majority of users on the market are looking for the best performance/price, and the GTX 1650 priced itself out of the market it should be competing with.
  • Oxford Guy - Saturday, May 4, 2019 - link

    "it'd take 2 or so years of power bills (at least according to avg American power bill per month) to split an even cost with the GTX 1650, and even at that mark where the cost of ownership is equivalent, the RX 570 still has provided 2 years of consistently better performance, and will continue to offer better performance."

    This.

    Plus, if people are so worried about power consumption maybe they should get some solar panels.
  • Yojimbo - Sunday, May 5, 2019 - link

    Why in the world would you get solar panels? That would only increase the cost even more!
  • Karmena - Tuesday, May 7, 2019 - link

    So, you multiplied it once, why not multiply that value again and make it $100?
  • Gigaplex - Sunday, May 5, 2019 - link

    Kids living with their parents generally don't care about the power bill.
  • gglaw - Sunday, May 5, 2019 - link

    Wrong on so many levels. If you find the highest-cost electricity city in the US, plug in the most die-hard gamer who plays only new games on max settings that run the GPU at 100% load at all times, and assume he plays more hours than most people work, you might get close to those numbers. The sad kid who fits the above scenario games hard enough that he would never choose to get such a bad card that is significantly slower than last gen's budget performers (RX 570 and GTX 1060 3GB). Kids in this scenario would not be calculating the nickels and dimes they're saving here and there - they'd be getting the best card in their NOW budget without subtracting the quarter or so they might get back a week. You're trying to create a scenario that just doesn't exist. Super energy-conscious people logging every penny of juice they spend don't game dozens of hours a week, and would be nit-picky enough that they would probably find settings to save that extra 2 cents a week, so they wouldn't even be running their GPU at 100% load.
  • PeachNCream - Friday, May 3, 2019 - link

    Total cost of ownership is a significant factor in any buying decision. Not only should one consider the electrical costs of a GPU, but indirect additional expenses such as air conditioning needs or reductions in heating costs offset by heat output along with the cost to upgrade at a later date based on the potential for dissatisfaction with future performance. Failing to consider those and other factors ignores important recurring expenses.
  • Geranium - Saturday, May 4, 2019 - link

    Then people need to buy the Ryzen 7 2700X rather than the i9-9900K. The 9900K uses more power and runs hot, so it needs a more powerful cooler, and a more powerful cooler draws more current compared to a 2700X.
  • nevcairiel - Saturday, May 4, 2019 - link

    Not everyone puts as much value on cost as others. When discussing a budget product, it absolutely makes sense to consider, since you possibly wouldn't buy such a GPU if money was no object.

    But if someone buys a high-end CPU, the interests shift drastically, and as such, your logic makes no sense anymore. Plenty of people buy the fastest not because it's cheap, but because it's the absolute fastest.
