The Test

While the GeForce GTX 1650 is rolling out as a fully custom launch, the nature of the entry-level segment means that partner boards will be very similar to one another; for one, going beyond the 75W TDP would require an external PCIe power connector. So the 75W ZOTAC GeForce GTX 1650, with its boost clock dialed down 30MHz to the reference specification, is a good stand-in for a generic reference GTX 1650, allowing us to keep testing and analysis as apples-to-apples as possible. While not perfect, this should be reasonably accurate for a virtual reference card as we look at reference-to-reference comparisons.

Overall, as this review primarily covers low- to mid-range cards, all game benchmarking is done at 1080p, looking at performance at our standard 1080p Ultra settings, as well as at the High and Medium options that are better suited to these sorts of sub-$150 cards.

Test Setup
CPU: Intel Core i7-7820X @ 4.3GHz
Motherboard: Gigabyte X299 AORUS Gaming 7 (F9g)
PSU: Corsair AX860i
Storage: OCZ Toshiba RD400 (1TB)
Memory: G.Skill TridentZ DDR4-3200, 4 x 8GB (16-18-18-38)
Case: NZXT Phantom 630 Windowed Edition
Monitor: LG 27UD68P-B
Video Cards:
ZOTAC GAMING GeForce GTX 1650 OC
NVIDIA GeForce GTX 1650 (4GB)
AMD Radeon RX 570 8GB
AMD Radeon RX 570 4GB
AMD Radeon RX 460 4GB (14 CU)
AMD Radeon R7 370 (2GB)
NVIDIA GeForce GTX 1660 Ti
NVIDIA GeForce GTX 1660
NVIDIA GeForce GTX 1060 6GB Founders Edition (1280 cores)
NVIDIA GeForce GTX 1060 3GB (1152 cores)
NVIDIA GeForce GTX 1050 Ti (4GB)
NVIDIA GeForce GTX 1050 2GB
NVIDIA GeForce GTX 960 (2GB)
NVIDIA GeForce GTX 950
NVIDIA GeForce GTX 750 Ti
Video Drivers: NVIDIA Release 430.39; AMD Radeon Software Adrenalin 2019 Edition 19.4.3
OS: Windows 10 x64 Pro (1803), Spectre and Meltdown patched

On the driver side, in addition to not being made available before launch, the 430.39 release has not been the smoothest either, with a 430.59 hotfix coming out just this week to resolve bugs and performance issues. In our testing, we did observe some flickering in Ashes of the Singularity.

Comments

  • Gigaplex - Sunday, May 5, 2019 - link

    I spend more than that on lunch most days.
  • Yojimbo - Sunday, May 5, 2019 - link

    "I spend more than that on lunch most days."

    Economics is hard.
  • gglaw - Sunday, May 5, 2019 - link

    At least you went through and acknowledged how horribly wrong the math was, so the entire initial premise is flawed. The $12.50 per year is also a very high-end scenario that would rarely fit a hardcore gamer who cares about tiny amounts of power savings: it assumes 3 hours per day, 7 days a week, never missing a day of gaming, and that every single minute of that computer time runs the GPU at 100%. Even if you twist every number to match your claims, it just doesn't pan out - period. The video cards being compared are not $25 apart. Energy-conscious adults who care that much about every penny they spend on electricity don't game hardcore 21 hours a week. If you use realistic numbers of 2-3 hours of game time 5 times a week, and account for the fact that the GPUs are not constantly at 100% load - say a more realistic figure like 75% of max power usage on average - this results in a value much below the $25 (which, again, is only half the price difference of the GPUs you're comparing). Using these more realistic numbers, it's closer to $8 per year in energy cost difference to own a superior card that delivers better gaming quality for over a thousand hours. If saving $8 is that big a deal to you at the cost of a lower gaming experience, then you're not really a gamer and probably don't care what card you're running. Just run a 2400G at 720p and low settings and call it a day. Playing the math game with blatantly wrong numbers doesn't validate the value of this card.
  • zodiacfml - Saturday, May 4, 2019 - link

    Right. My calculation is a bit higher, with $0.12 per kWh, but playing 8 hours a day, 365 days a year.
    I will take the RX 570 and undervolt it to reduce the consumption.
  • Yojimbo - Saturday, May 4, 2019 - link

    Yes, good idea. Then you can get the performance of the 1650 for just a few more watts than the 1650.
  • eddieobscurant - Sunday, May 5, 2019 - link

    No, it doesn't. It's about 25 dollars over a 2-year period, if you play for 8 hours/day, every day for 2 years. If you're gaming less, or just browsing, the difference is way smaller.
  • spdragoo - Monday, May 6, 2019 - link

    Per my last bill, I pay $0.0769USD per kWh. So, spending $50USD means I've used 650.195056 kWh, or 650,195.056 Wh. Comparing the power usage at full, it looks like on average you save maybe 80W using the GTX 1650 vs. the RX 570 (75W at full power, 86W at idle, so call it 80W average). That means it takes me (650195.056 Wh / 80W) = 8,127.4382 hours of gaming to have "saved" that much power. In a 2-year period, assuming the average 365.25 days per year & 24 hours per day, there's a maximum available of 17,532 hours. The ratio, then, of the time needed to spend gaming vs. total elapsed time in order to "save" that much power is (8127.4382 / 17532) = 46.36%...which equates to an average of 11.13 hours (call it 11 hours 8 minutes) of gaming ***per day***. Now, ***MAYBE*** if I a) didn't have to work (or the equivalent, i.e. school) Monday through Friday, b) didn't have some minimum time to be social (i.e. spending time with my spouse), c) didn't have to also take care of chores & errands (mowing the lawn, cleaning the house, grocery shopping, etc.), & d) take the time for other things that also interest me besides PC gaming (reading books, watching movies & TV shows, taking vacations, going to Origins & comic book conventions, etc.), & e) had someone providing me a roof to live under/food to eat/money to spend on said games & PC, I ****MIGHT**** be able to handle that kind of gaming schedule...but I not only doubt that would happen, but I would probably get very bored & sick of gaming (PC or otherwise) in short order.

    Even someone who's more of an avid gamer & averages 4 hours of gaming per day, assuming their cost for electricity is the same as mine, will need to wait ***five to six years*** before they can say they saved $50USD on their electrical bill (or the cost of a single AAA game). But let's be honest; even avid gamers of that level are probably not going to be satisfied with a GTX 1650's performance (or even an RX 570's); they're going to want a 1070/1080/1080TI/2060/2070/2080 or similar GPU (depending on their other system specs). Or, the machine rocking the GTX 1650 is their ***secondary*** gaming PC...& since even that is going to set them back a few hundred dollars to build, I seriously doubt they're going to quibble about saving maybe $1 a month on their electrical bill.
  • Foeketijn - Tuesday, May 7, 2019 - link

    You need to game on average 4 hours per day to reach the 50 euros in two years.
    If gaming is that important to you, you might want to look at another video card.
  • Hixbot - Tuesday, May 7, 2019 - link

    I think performance per watt is an important metric to consider, not because of money saved on electricity but because of less heat dumped into my case.
  • nathanddrews - Friday, May 3, 2019 - link

    Yeah, sure seems like it. RX 570s have been pretty regularly $120 (4GB) to $150 (8GB) for the last five months. I'm guessing we'll see a 1650SE with 3GB for $109 soon enough (but it won't be labeled as such)...
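For readers who want to rerun the break-even arithmetic traded back and forth in the comments above, here is a minimal Python sketch. The electricity rates, the roughly 80W power delta between the RX 570 and GTX 1650, and the play schedules are the commenters' assumptions, used purely for illustration; they are not figures measured in this review.

def break_even_hours(price_delta_usd, watt_delta, usd_per_kwh):
    # Hours of gaming needed before the power savings equal the price gap.
    kwh_needed = price_delta_usd / usd_per_kwh   # energy worth the price difference
    return kwh_needed * 1000.0 / watt_delta      # Wh divided by W gives hours

def yearly_savings_usd(watt_delta, hours_per_week, usd_per_kwh, avg_load=1.0):
    # Electricity cost saved per year for a given weekly gaming schedule.
    kwh_per_year = watt_delta * avg_load * hours_per_week * 52.0 / 1000.0
    return kwh_per_year * usd_per_kwh

if __name__ == "__main__":
    # spdragoo's scenario: ~80W delta, $0.0769/kWh, $50 target over two years
    hours = break_even_hours(50.0, 80.0, 0.0769)
    print(f"Break-even after {hours:,.0f} hours (~{hours / (2 * 365.25):.1f} hours/day for two years)")

    # A 'realistic' schedule along gglaw's lines: 12.5 hours/week at ~75% of the full-load delta
    print(f"Yearly savings: ${yearly_savings_usd(80.0, 12.5, 0.12, avg_load=0.75):.2f}")

Plugging in different rates and schedules shows why the cost argument swings so widely: at a few hours of gaming per week the difference amounts to a handful of dollars per year, and only near-constant full-load use gets anywhere near the price gap between the cards.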
