A bit over 8 months after it all began, the tail-end of NVIDIA’s GeForce Turing product stack launch is finally in sight. This morning the company is rolling out its latest and cheapest GeForce Turing video card, the GeForce GTX 1650. Coming in at $149, the newest member of the GeForce family is set to bring up the rear of the GeForce product stack, offering NVIDIA’s latest architecture in a low-power, 1080p-with-compromises gaming video card with a budget-friendly price to match.

In very traditional NVIDIA fashion, the Turing launch has been a top-to-bottom affair. After launching the four RTX 20 series cards early in the cycle, NVIDIA’s efforts in the last two months have been focused on filling in the back end of their product stack. Central to this is a design variant of NVIDIA’s GPUs, the TU11x series – what I’ve been dubbing Turing Minor – which are intended to be smaller, easier to produce chips that retain the all-important core Turing architecture, but do away with the dedicated ray tracing (RT) cores and the AI-focused tensor cores as well. The end result of this bifurcation has been the GeForce GTX 16 series, which is designed to be a leaner and meaner set of Turing GeForce cards.

To date the GTX 16 series has consisted solely of the GTX 1660 family of cards – the GTX 1660 (vanilla) and GTX 1660 Ti – both of which are based on the TU116 GPU. Today, however, the GTX 16 series family is expanding with the introduction of the GTX 1650 and the new Turing GPU powering NVIDIA’s junior-sized card: TU117.


Unofficial TU117 Block Diagram

Whereas the GeForce GTX 1660 Ti and the underlying TU116 GPU served as our first glimpse at NVIDIA’s mainstream product plans, the GeForce GTX 1650 is a much more pedestrian affair. The TU117 GPU beneath it is for all practical purposes a smaller version of the TU116 GPU, retaining the same core Turing feature set, but with fewer resources all around. Altogether, coming from the TU116 NVIDIA has shaved off one-third of the CUDA cores, one-third of the memory channels, and one-third of the ROPs, leaving a GPU that’s smaller and easier to manufacture for this low-margin market. Still, at 200mm² in size and housing 4.7B transistors, TU117 is by no means a simple chip. In fact, it’s exactly the same die size as GP106 – the GPU at the heart of the GeForce GTX 1060 series – so that should give you an idea of how performance and transistor counts have (slowly) cascaded down to cheaper products over the last few years.

At any rate, TU117 will be going into numerous NVIDIA products over time. But for now, things start with the GeForce GTX 1650.

NVIDIA GeForce Specification Comparison

|                        | GTX 1650        | GTX 1660        | GTX 1050 Ti    | GTX 1050       |
|------------------------|-----------------|-----------------|----------------|----------------|
| CUDA Cores             | 896             | 1408            | 768            | 640            |
| ROPs                   | 32              | 48              | 32             | 32             |
| Core Clock             | 1485MHz         | 1530MHz         | 1290MHz        | 1354MHz        |
| Boost Clock            | 1665MHz         | 1785MHz         | 1392MHz        | 1455MHz        |
| Memory Clock           | 8Gbps GDDR5     | 8Gbps GDDR5     | 7Gbps GDDR5    | 7Gbps GDDR5    |
| Memory Bus Width       | 128-bit         | 192-bit         | 128-bit        | 128-bit        |
| VRAM                   | 4GB             | 6GB             | 4GB            | 2GB            |
| Single Precision Perf. | 3 TFLOPS        | 5 TFLOPS        | 2.1 TFLOPS     | 1.9 TFLOPS     |
| TDP                    | 75W             | 120W            | 75W            | 75W            |
| GPU                    | TU117 (200mm²)  | TU116 (284mm²)  | GP107 (132mm²) | GP107 (132mm²) |
| Transistor Count       | 4.7B            | 6.6B            | 3.3B           | 3.3B           |
| Architecture           | Turing          | Turing          | Pascal         | Pascal         |
| Manufacturing Process  | TSMC 12nm "FFN" | TSMC 12nm "FFN" | Samsung 14nm   | Samsung 14nm   |
| Launch Date            | 4/23/2019       | 3/14/2019       | 10/25/2016     | 10/25/2016     |
| Launch Price           | $149            | $219            | $139           | $109           |

Right off the bat, it’s interesting to note that the GTX 1650 is not using a fully-enabled TU117 GPU. Relative to the full chip, the version going into the GTX 1650 has had a TPC fused off, which means the chip loses 2 SMs/64 CUDA cores. The net result is that the GTX 1650 is a very rare case where NVIDIA doesn’t put their best foot forward at launch – the company is essentially sandbagging – a point I’ll loop back around to in a bit.

Within NVIDIA’s historical product stack, it’s somewhat difficult to place the GTX 1650. Officially it’s the successor to the GTX 1050, which itself was a similar cut-down card. However the GTX 1050 launched at $109, whereas the GTX 1650 launches at $149, a hefty 37% generation-over-generation price increase. Consequently, you could be forgiven for thinking that the GTX 1650 feels a lot more like the GTX 1050 Ti’s successor, as the $149 price tag is very comparable to the GTX 1050 Ti’s $139 launch price. Either way, generation-over-generation, Turing cards have been more expensive than the Pascal cards they have replaced, and the low price of these budget cards really amplifies this difference.
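For the record, those generational price deltas pencil out as follows. This is a trivial back-of-the-envelope check using the launch prices from the spec table; the dictionary is purely for illustration:

```python
# Launch prices (USD) from the spec table above.
launch_prices = {
    "GTX 1050": 109,
    "GTX 1050 Ti": 139,
    "GTX 1650": 149,
}

# GTX 1650 vs. its official predecessor, the GTX 1050
vs_1050 = (launch_prices["GTX 1650"] / launch_prices["GTX 1050"] - 1) * 100
print(f"vs. GTX 1050:    +{vs_1050:.0f}%")     # +37%

# ...and vs. the GTX 1050 Ti, the card it's actually priced against
vs_1050ti = (launch_prices["GTX 1650"] / launch_prices["GTX 1050 Ti"] - 1) * 100
print(f"vs. GTX 1050 Ti: +{vs_1050ti:.0f}%")   # +7%
```

In other words, against the card it most closely matches on price, the increase is modest; against its official predecessor, it's substantial.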

Diving into the numbers then, the GTX 1650 ships with 896 CUDA cores enabled, spread over 2 GPCs. On paper this is actually not all that big of a step up from the GeForce GTX 1050 series, but Turing’s architectural changes and effective increase in graphics efficiency mean that the little card should pack a bit more of a punch than the raw specs suggest. The CUDA cores themselves are clocked a bit lower than usual for a Turing card, however, with the reference-clocked GTX 1650 boosting to just 1665MHz.

Rounding out the package are 32 ROPs, which are part of the card’s 4 ROP/L2/memory clusters. This means the card is being fed by a 128-bit memory bus, which NVIDIA has paired with GDDR5 memory clocked at 8Gbps. Conveniently enough, this gives the card 128GB/sec of memory bandwidth, about 14% more than the last-generation GTX 1050 series cards got. Thankfully, while NVIDIA hasn’t done much to boost memory capacities on the other Turing cards, the same is not true for the GTX 1650: the minimum here is now 4GB, instead of the very constrained 2GB found on the GTX 1050. Not that 4GB is particularly spacious in 2019, but the card shouldn’t be quite so desperate for memory as its predecessor was.
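Those bandwidth figures fall straight out of bus width and data rate. A quick sanity-check sketch; the numbers come from the spec table, and the helper function is just for illustration:

```python
# Peak memory bandwidth = (bus width in bytes) * (effective data rate in Gbps).
def gddr_bandwidth(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak memory bandwidth in GB/sec."""
    return (bus_width_bits / 8) * data_rate_gbps

gtx_1650 = gddr_bandwidth(128, 8.0)  # GTX 1650: 128-bit bus @ 8Gbps GDDR5
gtx_1050 = gddr_bandwidth(128, 7.0)  # GTX 1050: 128-bit bus @ 7Gbps GDDR5

print(gtx_1650)                 # 128.0 GB/sec
print(gtx_1050)                 # 112.0 GB/sec
print(gtx_1650 / gtx_1050 - 1)  # ~0.14, i.e. about 14% more bandwidth
```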

Overall, on paper the GTX 1650 is set to deliver around 60% of the performance of the next card up in NVIDIA’s product stack, the GTX 1660. In practice I expect the two to be a little closer than that – GPU performance scaling isn’t quite 1-to-1 – but that is the ballpark area we’re looking at right now until we can actually test the card.
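That ~60% figure comes straight from the paper specs: peak FP32 throughput is CUDA cores × boost clock × 2 FLOPs per clock (one fused multiply-add per core per cycle). A quick sketch using the spec-table numbers:

```python
# Peak FP32 throughput = CUDA cores * boost clock (MHz) * 2 FLOPs (one FMA) / 1e6.
def fp32_tflops(cuda_cores: int, boost_clock_mhz: int) -> float:
    """Theoretical single-precision throughput in TFLOPS."""
    return cuda_cores * boost_clock_mhz * 2 / 1e6

gtx_1650 = fp32_tflops(896, 1665)   # ~2.98 TFLOPS
gtx_1660 = fp32_tflops(1408, 1785)  # ~5.03 TFLOPS

print(f"GTX 1650 is ~{gtx_1650 / gtx_1660:.0%} of the GTX 1660 on paper")
```

Note that this is a theoretical ceiling; as mentioned above, real-world scaling is rarely 1-to-1.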

Meanwhile when it comes to power consumption, the smallest member of the GeForce Turing stack is also the lowest power. NVIDIA has held their GTX xx50 cards at 75W (or less) for a few generations now, and the GTX 1650 continues this trend. Which means that, at least for cards operating at NVIDIA’s reference clocks, an additional PCIe power connector is not necessary and the card can be powered solely off of the PCIe bus. This satisfies the need for a card that can be put in basic systems where a PCIe power cable isn’t available, or in low-power systems where a more power-hungry card isn’t appropriate. This also means that while discrete video cards aren’t quite as popular as they once were for HTPCs, for HTPC builders who are looking to go that route, the GTX 1650 is going to be the GTX 1050 series’ replacement in that market as well.

Reviews, Product Positioning, & The Competition

Shifting gears to business matters, let’s talk about product positioning and hardware availability.

The GeForce GTX 1650 is a hard launch for NVIDIA; that means that cards are shipping from retailers and in OEM systems starting today. Typical for low-end NVIDIA cards, there are no reference cards or reference designs to speak of, so NVIDIA’s board partners will be doing their own thing with their respective product lines. Notably, these will include factory overclocked cards that offer more performance, but which will also require an external PCIe power connector in order to meet the cards’ greater energy needs.

Despite this being a hard launch however, in a very unorthodox (if not outright underhanded) move, NVIDIA has opted not to allow the press to test GTX 1650 cards ahead of time. Specifically, NVIDIA has withheld the driver necessary to test the card, which means that even if we had been able to secure a card in advance, we wouldn’t have been able to run it. We do have cards on the way and we’ll be putting together a review in due time, but for the moment we have no more hands-on experience with GTX 1650 cards than you, our readers, do.

NVIDIA has always treated low-end card launches as a lower-key affair than their high-end wares, and the GTX 1650 is no different. In fact this generation’s launch is particularly low-key: we have no pictures or even a press deck to work with, as NVIDIA opted to inform us of the card over email. And while there’s little need for extensive fanfare – it’s a Turing card, and the Turing architecture/feature set has been covered to excess by now – it’s rare that a card based on a new GPU launches without reviewers getting an early crack at it. And that’s for a good reason: reviewers offer neutral, third-party analysis of the card and its performance. So it’s generally not in buyers’ best interests to cut out reviewers – and when it happens it can raise some red flags – but nonetheless here we are.

At any rate, while I’d suggest that buyers hold off for a week or so for reviews to be put together, Turing at this point is admittedly a known quantity. As we mentioned earlier, the on-paper specifications put the GTX 1650 at around 60% of the GTX 1660’s performance, and real-world performance will probably be a bit higher. NVIDIA for their part is primarily pitching the card as an upgrade for the GeForce GTX 950 and its same-generation AMD counterparts, matching the upgrade cadence we’ve seen throughout the rest of the GeForce Turing family. NVIDIA says that performance should be 2x (or better) that of the GTX 950, and this is something that should be easily achieved.

While we’re waiting to get our hands on a card to run benchmarks, broadly speaking the GTX xx50 series of cards are meant to be 1080p-with-compromises cards, and I’m expecting much the same for the GTX 1650 based on what we saw with the GTX 1660. The GTX 1650 should be able to run some games at 1080p at maximum image quality – think DOTA2 and the like – but in more demanding games I expect it to have to drop back on some settings to stay at 1080p with playable framerates. One advantage that it does have here, however, is that with its 4GB of VRAM, it shouldn’t struggle nearly as much on more recent games as the 2GB GTX 950 and GTX 1050 do.

Strangely enough, NVIDIA is also offering a game bundle (of sorts) with the GTX 1650. Or rather, the company has extended their ongoing Fortnite bundle to cover the new card, along with the rest of the GeForce GTX 16 lineup. The bundle itself isn’t much to write home about – some game currency and skins for a game that’s free to begin with – but it’s an unexpected move since NVIDIA wasn’t offering this bundle on the other GTX 16 series cards when they launched.

Meanwhile, looking at the specs of the GTX 1650 and how NVIDIA has opted to price the card, it’s clear that NVIDIA is holding back a bit. Normally the company launches two low-end cards at the same time – a card based on a fully-enabled GPU and a cut-down card – which they haven’t done this time. This means that NVIDIA is sitting on the option of rolling out a fully-enabled TU117 card in the future if they want to. And while the actual CUDA core count differences between the GTX 1650 and a theoretical GTX 1650 Ti are quite limited – to the point where a few more CUDA cores alone would probably not be worth it – NVIDIA also has another ace up its sleeve in the form of GDDR6 memory. If the conceptually similar GTX 1660 Ti is anything to go by, a fully-enabled TU117 card with a small bump in clockspeeds and 4GB of GDDR6 could probably pull far enough ahead of the vanilla GTX 1650 to justify a new card, perhaps at $179 or so to fill the gap in NVIDIA’s current product stack.

Finally, as for the competition, AMD of course is riding out the tail-end of the Polaris-based Radeon RX 500 series, so this is what the GTX 1650 will be up against. AMD is trying very hard to set up the Radeon RX 570 8GB against the GTX 1650, which makes for a very interesting battle. Based on what we saw with the GTX 1660, the RX 570 should perform rather well versus the GTX 1650, and the 8GB of VRAM would be the icing on the cake. However I’m not sure AMD and its partners can necessarily hold 8GB card prices to $149 or less, in which case the competition may end up being the 4GB RX 570 instead.

Ultimately AMD’s position is going to be that while they can’t match the GTX 1650 on features or power efficiency – bear in mind that the RX 570 is rated to draw almost twice as much power – they can match it on pricing and beat it on performance. As long as AMD holds the line there, this is going to be a favorable matchup for AMD on a pure price/performance basis for current-generation games. To see just how favorable, we’ll of course need to benchmark the GTX 1650, so be sure to stay tuned for that.

Q2 2019 GPU Pricing Comparison

| AMD                 | Price | NVIDIA                            |
|---------------------|-------|-----------------------------------|
|                     | $349  | GeForce RTX 2060                  |
| Radeon RX Vega 56   | $279  | GeForce GTX 1660 Ti               |
| Radeon RX 590       | $219  | GeForce GTX 1660                  |
| Radeon RX 580 (8GB) | $189  | GeForce GTX 1060 3GB (1152 cores) |
| Radeon RX 570       | $149  | GeForce GTX 1650                  |
55 Comments

  • BigMamaInHouse - Tuesday, April 23, 2019 - link

    Saw links to the driver listed already: Version: 430.39 WHQL.
    https://www.nvidia.com/Download/driverResults.aspx...
  • jeremyshaw - Tuesday, April 23, 2019 - link

    Reading the release notes, this is the new driver stack (430). No more 3D Vision & no more Mobile Kepler support. 356MB, a bit smaller than the 550MB+ drivers previously offered.
  • LiquidSilverZ - Tuesday, April 23, 2019 - link

    "356MB, a bit smaller than the 550MB+ drivers previously offered"
    The driver download page lists the file (430.39) as 356MB, however once you click the download link, it comes up as 537MB actual download. Very frustrating.
    I have to pay for bandwidth and sure miss the old style releases that had English (which were small file size downloads) and International as two separate choices. These international only downloads are bloated and a waste of bandwidth.
  • ballsystemlord - Saturday, April 27, 2019 - link

    Maybe you don't know, but tools like curl (curl -I URL; -I as in the letter i, not the letter L) can issue a HEAD request, and that will tell you the file's length at the cost of only a few bytes sent each way.
  • jeremyshaw - Tuesday, April 23, 2019 - link

    I suppose this brings up a few questions: is this the end of fabbing at Samsung for Nvidia? How close to 75W is this card? I remember the GTX1050ti having a fair bit of headroom on that 75W claim. Is there even a possibility for a passive card? They seem to be getting fewer in number, every generation.
  • PeachNCream - Tuesday, April 23, 2019 - link

    Some of those OEM boards are huge and feature obnoxious (possibly unnecessary) multi-slot coolers. Given the TDP, I'm betting the card could operate very well with lots less fan and radiation surface, but OEM perception of gamer expectation is probably the onus for the huge coolers. Maybe there are cost savings in using relatively standardized, giant double slot solutions that were already developed for more demanding GPUs.
  • jeremyshaw - Tuesday, April 23, 2019 - link

    LOL, I just saw EVGA's late April Fools joke. Triple slot, dual fan, GTX1650 card, with PCIe 6 pin, and stock clocks.
  • marsdeat - Tuesday, April 23, 2019 - link

    I thought you were JOKING, but had to go check it out. They even list the power at 75W! What on earth are they DOING?!
  • BenSkywalker - Tuesday, April 23, 2019 - link

    Overclocking.

    If the board draws 75 watts at default clocks and lacks an external power connector, then you are pretty much capped when it comes to overclocking (within a fairly small variance). If you are an enthusiast considering this after reviews hit, an external power connector should be at the top of your list (unless you are looking for a quiet SFF-type setup).
  • PeachNCream - Tuesday, April 23, 2019 - link

    I wouldn't argue that a 75W card should get an external power connector for overclocking, but the triple slot cooler with two fans is overkill even if the OC pushes the card up another 25-30W.
