Discrete GPU Gaming Tests

4K Minimum with RTX 2080 Ti

In contrast to 1080p Max, our 4K Minimum testing aims to find differences between CPUs at a playable resolution. There are still plenty of pixels to churn, but the RTX 2080 Ti should at least be hitting 60 FPS in most games here.

A full list of results at various resolutions and settings can be found in our Benchmark Database.

(a-3) Chernobylite - 4K Low - Average FPS

No real change in Chernobylite.

(b-5) Civilization VI - 4K Min - Average FPS
(b-6) Civilization VI - 4K Min - 95th Percentile

Civilization VI gets a smaller increase in performance here than in the 1080p Maximum test, but there's still a benefit over the previous generation. That being said, the AMD desktop CPUs with more cache pull well ahead here.

(c-5) Deus Ex MD - 4K Min - Average FPS
(c-6) Deus Ex MD - 4K Min - 95th Percentile

Deus Ex seems to come to an asymptotic limit, and while the 4000G APUs were behind the curve, the 5000G APUs are solidly there.

(e-5) Final Fantasy 15 - 4K Standard - Average FPS

All CPUs pretty much hit a limit in FF15 above and Far Cry 5 below.

(i-5) Far Cry 5 - 4K Low - Average FPS
(i-6) Far Cry 5 - 4K Low - 95th Percentile

(k-5) Grand Theft Auto V - 4K Low - Average FPS
(k-6) Grand Theft Auto V - 4K Low - 95th Percentile

GTA V hits an odd glitchy mess around 180 FPS, and the new 5000G CPUs push the RTX 2080 Ti a little further in that direction; at this point it's probably best to start cranking up some detail settings to avoid it.


  • GeoffreyA - Friday, August 13, 2021 - link

    SVC sounds pretty interesting, and the idea of layered encoding is becoming more common in today's codecs. I'd expect that H.265/6 and AV1 took it further. At least in their standards. Implementation is another story. Also reminds me of HE-AAC's extensions, the spectral band stuff and parametric stereo, which are just ignored by older decoders.
  • GeoffreyA - Friday, August 13, 2021 - link

    Even for files played on a TV, you've got to follow the profiles and levels, otherwise it just doesn't work. On our Samsung TV, I think I'm limited to H.264 High profile, level 4.1. Lamentably, it doesn't support H.265, which means re-encoding.
  • GeoffreyA - Thursday, August 5, 2021 - link

    Goodness, this is the one I've been waiting for! Thanks, Ian.
  • GeoffreyA - Thursday, August 5, 2021 - link

    Solid CPU performance as expected, but a bit disappointing in the GPU department, and pricing could be better. The 5300G looks like an impressive little fellow as well, perhaps even the star of today's show.
  • yankeeDDL - Thursday, August 5, 2021 - link

    Every time I see the peak-power charts, it amazes me that anyone considers Intel these days.
    28 W parts burning 51 W.
    125 W parts burning 277 W, and still slower than the 5800X at 140 W.
  • Spunjji - Thursday, August 5, 2021 - link

    It's always worth noting that the peak power on Intel chips is pretty pathological, and they tend to not go quite so high under ordinary loads. They're still lousy compared to AMD on perf/W, but they're a little better than 50% of AMD.
  • Makste - Thursday, August 5, 2021 - link

    Efficient tools. A 65 W 5700G matches an 11700K at 125 W.
  • abufrejoval - Thursday, August 5, 2021 - link

    The price of novelty, I guess: on the first day of official sales, the 5700G rides €70 above the 5800X, nicely filling the gap to the 5900X :-)
  • chazzzer - Thursday, August 5, 2021 - link

    The 5700G is available today at MSRP on AMD's website.
  • Spunjji - Thursday, August 5, 2021 - link

    Is it possible that a follow-up could be done assessing the performance impact of RAM speeds? I'd really love to know whether it's possible to get more out of the 5300G and the 5700G in particular - potentially enough to get 60fps at 1080p with some moderate sacrifices to detail settings.
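As a side note on the Level 4.1 limit mentioned in the comments above: H.264 levels cap, among other things, the macroblock throughput a decoder must handle, which is why 1080p30 fits within Level 4.1 while 1080p60 does not. A minimal sketch of that check in Python, using the Level 4.1 limits from the H.264 spec (245,760 macroblocks/s, 8,192 macroblocks per frame); the function name `fits_level_4_1` is just for illustration:

```python
import math

# H.264 Level 4.1 decoder limits (ITU-T H.264, Annex A, Table A-1):
MAX_MBS_PER_SEC = 245_760    # max macroblock processing rate
MAX_MBS_PER_FRAME = 8_192    # max frame size in macroblocks

def fits_level_4_1(width: int, height: int, fps: float) -> bool:
    """Check whether a resolution/framerate fits H.264 Level 4.1.

    Macroblocks are 16x16 pixels; partial blocks round up.
    (Illustrative only; real levels also bound bitrate and
    decoded-picture-buffer size.)
    """
    mbs_per_frame = math.ceil(width / 16) * math.ceil(height / 16)
    return (mbs_per_frame <= MAX_MBS_PER_FRAME
            and mbs_per_frame * fps <= MAX_MBS_PER_SEC)

print(fits_level_4_1(1920, 1080, 30))  # True: 1080p30 fits
print(fits_level_4_1(1920, 1080, 60))  # False: 1080p60 needs Level 4.2
```

1080p is 120 × 68 = 8,160 macroblocks per frame, so at 30 fps it sits just under the 245,760 MB/s ceiling; doubling the frame rate blows past it, which is why a TV capped at High@4.1 rejects 1080p60 files.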
