Gaming: F1 2018

Aside from keeping up to date with the Formula One world, F1 2017 added HDR support, which F1 2018 has maintained; otherwise, we should see any newer versions of Codemasters' EGO engine find their way into F1 over time. Graphically demanding in its own right, F1 2018 gives us a useful racing-type graphics workload in our benchmark suite.

We use the in-game benchmark, set to run on the Montreal track in the wet, driving as Lewis Hamilton from last place on the grid. Data is taken over a one-lap race.

All of our benchmark results can also be found in our benchmark engine, Bench.
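
The data below lists two figures for each settings tier: an average FPS over the run and a 95th percentile figure. As a rough illustration of how such numbers are typically derived from per-frame render times, here is a minimal sketch; the frame-time values are made up for the example and this is not a description of our actual capture tooling.

```python
# Sketch: deriving an average FPS and a 95th-percentile FPS from per-frame
# render times. The frame_times_ms list is made-up example data, not a real
# capture from the F1 2018 benchmark.

frame_times_ms = [16.2, 17.0, 15.8, 22.5, 16.4, 18.1, 30.2, 16.9, 17.3, 16.6]

def average_fps(times_ms):
    """Average frame rate over the whole run: frames rendered / seconds elapsed."""
    return len(times_ms) / (sum(times_ms) / 1000.0)

def percentile_fps(times_ms, pct=95):
    """Frame rate at the pct-th percentile frame time (i.e. among the slowest frames)."""
    ordered = sorted(times_ms)                      # slowest frames end up at the back
    idx = round(pct / 100.0 * (len(ordered) - 1))   # nearest-rank style index
    return 1000.0 / ordered[idx]

print(f"Average FPS:         {average_fps(frame_times_ms):.1f}")
print(f"95th percentile FPS: {percentile_fps(frame_times_ms):.1f}")
```

The 95th percentile number sits below the average because it is dominated by the occasional slow frame, which is why it tends to be the more useful smoothness metric.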

[F1 2018 results charts: Average FPS and 95th Percentile, tested at IGP, Low, Medium, and High settings]

Comments

  • Agent Smith - Friday, November 1, 2019 - link

    Only one year warranty with this CPU, reduced from 3 years. So it's marginally faster, uses more power, offers no gaming advantages, and its price hike doesn't justify the performance gain and warranty disadvantage over the 9900K.

    ... and the 3950x is about to arrive. Mmm?
  • willis936 - Friday, November 1, 2019 - link

    Counter-Strike really needs to be added to the benchmarks. It's just silly how useless these gaming benchmarks are. There is virtually nothing that separates any of the processors. How can you recommend it for gaming when your data shows that a processor half the price is just as good? Test the real scenarios in which people would actually want to use this chip.
  • Xyler94 - Friday, November 1, 2019 - link

    It's more because you need a specific set of circumstances these days to see a difference in gaming that's bigger than the margin of error:

    You need at least a 2080, but preferably a 2080ti
    You need absolutely nothing else running on the computer other than OS, Game and launcher
    You need the resolution to be set at 1080p
    You need the quality to be at medium to high.

    Then you can see differences. CS:GO shows nice differences... but there's no monitor in the world that can display 400 to 500 FPS, so yeah... AnandTech still tests with a GTX 1080, which hardly taxes any modern CPU; that's why you see no differences.
  • willis936 - Friday, November 1, 2019 - link

    CS:GO is a proper use case. It isn't graphically intense, and people regularly play at 1440p120. Shaving milliseconds off input-to-display latency matters. I won't go into an in-depth analysis of why, but imagine that human response time has a Gaussian distribution and whoever responds first wins. Even if the mean response time is 150 ms, if the standard deviation is 20 ms and your input-to-display latency is 50 ms, then there are gains to cutting 20, 10, even 5 ms off of it (see the sketch after this comment).

    And yes, more fps does reduce input latency, even in cases where the monitor refresh rate is lower than the fps.

    https://youtu.be/hjWSRTYV8e0
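
To put rough numbers on that argument, here is a small Monte Carlo sketch using the figures from the comment above (150 ms mean reaction time, 20 ms standard deviation, a 50 ms baseline input-to-display latency): two players with identical reflexes duel, and the one whose reaction lands first wins. The latency savings tested are the comment's 5, 10, and 20 ms; everything here is illustrative, not measured.

```python
# Monte Carlo sketch: how much a lower input-to-display latency shifts the odds
# in a reaction duel. Reaction time ~ Gaussian(mean=150 ms, sd=20 ms) for both
# players, per the example numbers in the comment above; latencies are assumed.

import random

def win_probability(my_latency_ms, their_latency_ms,
                    mean=150.0, sd=20.0, trials=200_000):
    """Fraction of duels won when both players draw Gaussian reaction times
    but carry different fixed system latencies."""
    wins = 0
    for _ in range(trials):
        mine = random.gauss(mean, sd) + my_latency_ms
        theirs = random.gauss(mean, sd) + their_latency_ms
        if mine < theirs:
            wins += 1
    return wins / trials

BASELINE_MS = 50.0  # assumed input-to-display latency for the opponent
for saved in (5, 10, 20):
    p = win_probability(BASELINE_MS - saved, BASELINE_MS)
    print(f"Cutting {saved:2d} ms: win ~{p * 100:.0f}% of duels (vs. 50% at parity)")
```

In this toy model even the 5 ms cut moves the expected win rate to roughly 57%, and 20 ms pushes it to around 76%, which is the kind of edge the commenter is describing.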
  • Xyler94 - Tuesday, November 5, 2019 - link

    If you visually can't react fast enough, it doesn't matter how quickly the game can take an input; you're still limited by the information presented to you. 240Hz is the fastest you can go, and 400 FPS vs 450 FPS isn't gonna win you tournaments.

    CS:GO is not a valid test, as there's more to gaming than FPS. Input lag is more about the drivers and peripherals, and there's even lag between your monitor and GPU to consider. But go on, pretend an extra 50 FPS on top of 400+ makes that huge of a difference.
  • solnyshok - Friday, November 1, 2019 - link

    No matter what the GHz, buying a 14nm/PCIe 3 chip and motherboard just before 10nm/PCIe 4 comes to market... Seriously? Wait another 6 months.
  • mattkiss - Friday, November 1, 2019 - link

    10nm/PCIe 4 isn't coming to desktop next year, where did you hear that?
  • eek2121 - Friday, November 1, 2019 - link

    The 3700X is totally trolling Intel right now.
  • RoboMurloc - Friday, November 1, 2019 - link

    I dunno if anyone has mentioned it yet, but the KS has additional security mitigations for exploits baked in, which are probably causing the performance regressions.
  • PeachNCream - Friday, November 1, 2019 - link

    I expect I will never own an i9-9900KS or a Ryzen 7 3700X, but it is interesting to see how close AMD's 65W 8 core chip gets to Intel's 127+W special edition CPU in terms of performance in most of these benchmarks.
