The NVIDIA GeForce GTX 1650 Review, Feat. Zotac: Fighting Brute Force With Power Efficiency
by Ryan Smith & Nate Oh on May 3, 2019 10:15 AM EST

Battlefield 1 (DX11)
Battlefield 1 returns from the 2017 benchmark suite with a bang, as DICE brought gamers the long-awaited AAA World War 1 shooter a little over a year ago. With detailed maps, environmental effects, and pacy combat, Battlefield 1 provides a generally well-optimized yet demanding graphics workload. The next Battlefield game from DICE, Battlefield V, completes the nostalgia circuit with a return to World War 2, but more importantly for us, it is one of the flagship titles for GeForce RTX real-time ray tracing.
We use the Ultra, High, and Medium presets with no alterations. As these benchmarks are from single-player mode, our rule of thumb for multiplayer performance still applies: multiplayer framerates generally dip to half of our single-player framerates. Battlefield 1 also supports HDR (HDR10, Dolby Vision).
Without a direct competitor in the current generation, the GTX 1650's intended position is somewhat vague, outside of iterating on Pascal's GTX 1050 variants. Looking back to Pascal's lineup, the GTX 1650 splits the difference between the GTX 1050 Ti and GTX 1060 3GB, and falls well short of the GTX 1660.
Compared to the RX 570, though, the GTX 1650 is handily outpaced, and Battlefield 1 is where the GTX 1650 falls the furthest behind. That being said, the RX 570 wasn't originally in this segment, with price being the common denominator. The RX 460, meanwhile, is well outclassed, and the additional 2 CUs in the RX 560 would be unlikely to significantly narrow the gap.
As for the ZOTAC card, its 30 MHz difference is unnoticeable in real-world terms.
126 Comments
philehidiot - Friday, May 3, 2019 - link
Over here, it's quite routine for people to consider the efficiency cost of using AC in a car and whether it's more sensible to open the window... If you had a choice between a GTX 1080 and a Vega 64, which perform nearly the same, and assume they cost nearly the same, then you'd take into account that one requires a small nuclear reactor to run whilst the other probably sips less energy than your current card. Also, some of us are on this thing called a budget. A $50 saving is a week's food shopping.

JoeyJoJo123 - Friday, May 3, 2019 - link
Except your comment is exactly in line with what I said: "Lower power for the same performance at a similar enough price can be a tie-breaker between two competing options, but that's not the case here for the 1650"
I'm not saying power use of the GPU is irrelevant, I'm saying performance/price is ultimately more important. The RX 570 GPU is not only significantly cheaper, but it outperforms the GTX 1650 in most scenarios. Yes, the RX 570 does so by consuming more power, but it'd take 2 or so years of power bills (at least going by the average American power bill per month) to reach an even cost with the GTX 1650, and even at that mark, where the cost of ownership is equivalent, the RX 570 has still provided 2 years of consistently better performance, and will continue to offer better performance.
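The break-even arithmetic behind this comment can be sketched in a few lines. Note that every figure here is an illustrative assumption, not a measurement from the review: the price gap, the load-power delta between the cards, the daily gaming hours, and the US electricity rate are all placeholder values.

```python
# Hypothetical break-even estimate. All input figures are assumptions
# for illustration only, not measured values from the review.

def break_even_years(price_gap_usd, watt_delta, hours_per_day, usd_per_kwh):
    """Years of gaming until the extra electricity cost of the
    higher-power card cancels out its upfront price advantage."""
    extra_kwh_per_year = watt_delta / 1000 * hours_per_day * 365
    extra_cost_per_year = extra_kwh_per_year * usd_per_kwh
    return price_gap_usd / extra_cost_per_year

# Assumed: the RX 570 is $30 cheaper, draws ~75 W more under load,
# sees 4 hours of gaming per day, at a ~$0.13/kWh US average rate.
years = break_even_years(price_gap_usd=30, watt_delta=75,
                         hours_per_day=4, usd_per_kwh=0.13)
print(f"{years:.1f} years")  # prints "2.1 years"
```

Under these assumed inputs the result lands near the "2 or so years" the commenter cites; with lighter gaming hours or cheaper electricity, the break-even point stretches out considerably further.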
Absolutely, a GTX 1080 is a smarter buy compared to the Vega 64 given the power consumption, but that's because power consumption was the tie-breaker. The comparison wouldn't be as ideal for the GTX 1080 if it cost 30% more than the Vega 64, offered similar performance, but came with the long-term promise of ~eventually~ paying for the upfront difference in cost with a reduction in power cost.
Again, the vast majority of users on the market are looking for the best performance/price, and the GTX 1650 has priced itself out of the market it should be competing with.
Oxford Guy - Saturday, May 4, 2019 - link
"it'd take 2 or so years of power bills (at least according to avg American power bill per month) to split an even cost with the GTX 1650, and even at that mark where the cost of ownership is equivalent, the RX 570 still has provided 2 years of consistently better performance, and will continue to offer better performance."This.
Plus, if people are so worried about power consumption maybe they should get some solar panels.
Yojimbo - Sunday, May 5, 2019 - link
Why in the world would you get solar panels? That would only increase the cost even more!

Karmena - Tuesday, May 7, 2019 - link
So, you multiplied it once, why not multiply that value again and make it $100?

Gigaplex - Sunday, May 5, 2019 - link
Kids living with their parents generally don't care about the power bill.

gglaw - Sunday, May 5, 2019 - link
Wrong on so many levels. If you find the city with the highest electricity cost in the US, plug in the most die-hard gamer who plays only new games on max settings that run the GPU at 100% load at all times, and assume he plays more hours than most people work, you might get close to those numbers. The sad kid who fits the above scenario games hard enough that he would never choose such a bad card, one significantly slower than last gen's budget performers (RX 570 and GTX 1060 3GB). Kids in this scenario would not be calculating the nickels and dimes they're saving here and there - they'd be getting the best card in their NOW budget without subtracting the quarter or so they might get back a week. You're trying to create a scenario that just doesn't exist. Super energy-conscious people logging every penny of juice they spend don't game dozens of hours a week, and would be nit-picky enough that they would probably find settings to save that extra 2 cents a week, so they wouldn't even be running their GPU at 100% load.

PeachNCream - Friday, May 3, 2019 - link
Total cost of ownership is a significant factor in any buying decision. Not only should one consider the electrical costs of a GPU, but indirect additional expenses such as air conditioning needs or reductions in heating costs offset by heat output, along with the cost to upgrade at a later date based on the potential for dissatisfaction with future performance. Failing to consider those and other factors ignores important recurring expenses.

Geranium - Saturday, May 4, 2019 - link
Then people need to buy the Ryzen R7 2700X rather than the i9 9900K. The 9900K uses more power and runs hot, so it needs a more powerful cooler, and a more powerful cooler draws more current compared to a 2700X.

nevcairiel - Saturday, May 4, 2019 - link
Not everyone puts as much value on cost as others. When discussing a budget product, it absolutely makes sense to consider, since you possibly wouldn't buy such a GPU if money was no object. But if someone buys a high-end CPU, the interests shift drastically, and as such, your logic makes no sense anymore. Plenty of people buy the fastest not because it's cheap, but because it's the absolute fastest.