Synthetics

Moving on to synthetic performance testing, there shouldn’t be any surprises here. This is the 4th TU116 card we’ve looked at – albeit the first with just a 128-bit memory bus – so its behavior is generally well understood.

Synthetic: TessMark - Image Set 4 - 64x Tessellation

Synthetic: Beyond3D Suite - Pixel Fillrate

Synthetic: Beyond3D Suite - Integer Texture Fillrate (INT8)

Synthetic: Beyond3D Suite - Floating Point Texture Fillrate (FP32)

Synthetic: Beyond3D Suite - INT8 Buffer Compression

Synthetic: Beyond3D Suite - FP32 Buffer Compression

With synthetic performance, we see the GTX 1650 Super benefiting significantly in tessellation throughput, most likely due to the inclusion of more GPCs to house the greater SM count. This still leaves it behind the GTX 1660 cards, but relatively close overall. Meanwhile, we see just a slight uptick in pixel throughput, reflecting the fact that while the card has a good deal more memory bandwidth than the regular GTX 1650, for this simple (and generally well-compressible) test, the extra bandwidth doesn't add much.
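The bandwidth advantage mentioned above is straightforward arithmetic. A quick sketch — the 128-bit bus is from the review, while the 8 Gbps GDDR5 and 12 Gbps GDDR6 data rates are the cards' publicly listed memory speeds, assumed here rather than taken from this page:

```python
# Peak memory bandwidth: (bus width / 8) bytes per transfer x data rate in Gbps.
def mem_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak bandwidth in GB/s for a GDDR memory bus."""
    return bus_width_bits / 8 * data_rate_gbps

# Assumed memory speeds: GTX 1650 (GDDR5 @ 8 Gbps), GTX 1650 Super (GDDR6 @ 12 Gbps).
gtx_1650 = mem_bandwidth_gb_s(128, 8)         # 128 GB/s
gtx_1650_super = mem_bandwidth_gb_s(128, 12)  # 192 GB/s
print(f"GTX 1650: {gtx_1650:.0f} GB/s, GTX 1650 Super: {gtx_1650_super:.0f} GB/s "
      f"({gtx_1650_super / gtx_1650 - 1:.0%} more)")
```

A 50% jump in peak bandwidth on paper, which is why the gains show up mainly where the framebuffer traffic is less compressible.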

Compute Power, Temperature, & Noise

  • WetKneeHouston - Monday, January 20, 2020 - link

    I got a 1650 Super over the 580 because it's more power efficient, and anecdotally I've experienced better stability with Nvidia's driver ecosystem.
  • yeeeeman - Friday, December 20, 2019 - link

    It is as if AMD didn't have a 7nm GPU, but a 14nm one.
  • philosofool - Friday, December 20, 2019 - link

    Can we not promote the idea, invented by card manufacturers, that everyone who isn't targeting 60fps and high settings is making a mistake? Please publish some higher resolution numbers for those of us who want that knowledge. Especially at the sub-$200 price point, many people are primarily using their computers for things other than games and gaming is a secondary consideration. Please let us decide which tradeoffs to make instead of making assumptions.
  • Dragonstongue - Friday, December 20, 2019 - link

    100% agreed on this.

    It's up to the consumers themselves how, where, and why they use the device as they see fit, be it gaming, streaming, "mundane" things like watching videos, or even emulation and "creation" work.

    IMO it's very much the same BS smartphone makers use(d) to ditch 3.5mm jacks: "customers do not want them anymore, and with limited space we had no choice."

    So instead of adjusting the design to keep the 3.5mm jack AND a large enough battery, they remove the jack and limit the battery size. ~95% are fully sealed so you cannot replace the battery, and nearly all of them these days are "glass" that is pretty by design but also stupidly easy to break, so you have no choice but to pay for a very costly repair and/or buy a new one.

    With GPUs they CAN make sure there is a DL-DVI connector, full-size HDMI, and a DP port (with maybe one mini DP), but they seem to "not bother," citing silly reasons like "it is impossible / customers no longer want this."

    As you point out, the consumer decides the usage case. Provide the best possible product, give the best possible no-BS review/test data, and we the consumers will decide with our WALLETS whether it is worth it or not.

    That would likely save consumers a lot of money and heartache, by not buying something they will regret in the first place.

    Hell, I am still gaming on a Radeon 7870 with a 144Hz 1440p monitor (it only runs at 60Hz because the card doesn't support higher). I still manage to game on it "just fine": maybe not ultra everything, but comfortably (for me) at high-to-medium "tweaked" settings.

    Amazing how long these last when they are built properly and not crap... that, and not having hundreds to thousands to spend every year or so (which is most people these days) should mean more to these mega-corps than "let us sell something most folks really do not need." Make it right, and upgrades will happen when they are really needed, instead of the card ending up in the e-waste can in a few months' time.
  • timecop1818 - Friday, December 20, 2019 - link

    DVI? No modern card should have that garbage connector. Just let it die already.
  • Korguz - Friday, December 20, 2019 - link

    Yea, ok, sure... so you still want the VGA connector instead???
  • Qasar - Friday, December 20, 2019 - link

    DVI is a lot more useful than the VGA connector that monitors STILL come with. Yet we STILL have those on new monitors. No modern monitor should have that garbage connector.
  • The_Assimilator - Saturday, December 21, 2019 - link

    No VGA. No DVI. DisplayPort and HDMI, or GTFO.
  • Korguz - Sunday, December 22, 2019 - link

    VGA: dead connector, limited use case, mostly business. DVI: still useful, especially in KVMs... I haven't seen a DisplayPort KVM, and the HDMI KVM I had died a few months after I got it, but the DVI KVMs I have still work fine. Each of the three (DVI, HDMI, and DisplayPort) still has its uses.
  • Spunjji - Monday, December 23, 2019 - link

    DisplayPort KVMs exist. More importantly, while it's trivial to convert a DisplayPort output to DVI for a KVM, you simply cannot fit the required bandwidth for a modern high-res DP monitor through a DVI port.

    DVI ports are large, low-bandwidth and have no place on a modern GPU.
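The bandwidth argument in the last comment is easy to sanity-check with rough numbers. A sketch, assuming dual-link DVI tops out at 2 × 165 MHz TMDS pixel clock at 24 bpp and DisplayPort 1.2 carries ~17.28 Gb/s of payload after 8b/10b coding; the flat ~12% blanking overhead stands in for exact CVT-RB timings, so treat the results as estimates:

```python
# Rough check: does a given video mode fit through dual-link DVI vs. DisplayPort 1.2?
DUAL_LINK_DVI_GBPS = 2 * 165e6 * 24 / 1e9  # two TMDS links x 165 MHz x 24 bpp = 7.92
DP12_HBR2_GBPS = 17.28                     # DP 1.2 HBR2 payload after 8b/10b coding

def mode_gbps(width: int, height: int, refresh_hz: int,
              bpp: int = 24, blanking: float = 1.12) -> float:
    """Approximate link bandwidth a mode needs, in Gb/s (flat ~12% blanking)."""
    return width * height * refresh_hz * bpp * blanking / 1e9

for w, h, hz in [(1920, 1080, 60), (2560, 1440, 60), (3840, 2160, 60)]:
    need = mode_gbps(w, h, hz)
    dvi = "fits" if need <= DUAL_LINK_DVI_GBPS else "too big"
    dp = "fits" if need <= DP12_HBR2_GBPS else "too big"
    print(f"{w}x{h}@{hz}Hz needs ~{need:.1f} Gb/s -> dual-link DVI: {dvi}, DP 1.2: {dp}")
```

By this estimate 1440p60 still squeezes through dual-link DVI, but 4K60 (let alone high-refresh modes) does not, which is the bandwidth wall being described.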
