Gaming: F1 2018

Aside from keeping up to date with the Formula One world, F1 2017 added HDR support, which F1 2018 has maintained; otherwise, we should see any newer versions of Codemasters' EGO engine find their way into F1. Graphically demanding in its own right, F1 2018 gives us a useful racing-type graphics workload in our benchmarks.

We use the in-game benchmark, set to run on the Montreal track in the wet, driving as Lewis Hamilton from last place on the grid. Data is taken over a one-lap race.

AnandTech CPU Gaming 2019 Game List
Game      Genre    Release Date   API    IGP        Low         Med       High
F1 2018   Racing   Aug 2018       DX11   720p Low   1080p Med   4K High   4K Ultra

All of our benchmark results can also be found in our benchmark engine, Bench.

[Benchmark results: F1 2018 Average FPS and 95th Percentile frame rates at the IGP, Low, Medium, and High test settings]
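
Both figures can be derived from a per-frame log. As a rough illustration only (this is not the tooling the review uses, and the log format is assumed), average FPS and 95th-percentile FPS might be computed from frame times in milliseconds like this:

    # Hypothetical sketch: derive Average FPS and 95th-percentile FPS
    # from a list of per-frame render times in milliseconds.
    import statistics

    def summarize(frame_times_ms):
        """Return (average_fps, p95_fps) for a list of frame times in ms."""
        # Average FPS is total frames divided by total elapsed time.
        avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
        # The 95th-percentile figure is the frame rate that 95% of frames
        # meet or exceed, i.e. it is set by the slowest 5% of frames.
        slow_frame_ms = statistics.quantiles(frame_times_ms, n=100)[94]
        p95_fps = 1000.0 / slow_frame_ms
        return avg_fps, p95_fps

    if __name__ == "__main__":
        sample = [16.7, 15.9, 17.2, 33.4, 16.1, 18.0, 16.5, 25.0]  # made-up data
        print(summarize(sample))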

Comments (143)

  • Stasinek - Wednesday, November 21, 2018 - link

    It's indeed a surprise to me that those new 24- and 32-core AMD processors (2920, 2970) are just worse in every respect than their 16-core equivalents. In terms of perf/money and perf/power they're laughable.
    Linux changes a lot, but who uses Linux, and for what purpose?
    I bet developers do, but what makes me really angry is that nobody even tries to use KVM, Xen, VirtualBox or VMware as a benchmarking tool for testing usage as a small-company server. In my company a lot of Remote Desktop sessions connect to the same server.
    Someone might think: who needs a good CPU? But that's only because they aren't used to solving real-life problems, and those problems are things like importing big databases from obsolete programs, then filtering, fixing, and exporting to new ERP systems. This consumes a lot of time, so having a fast CPU is crucial. Most of the companies I know use an RDP server for that purpose and give workers typical cheap portable laptops. It would be nice to see AMD or Intel HEDT tested for such purposes. Since anyone can potentially have 32 cores, anyone could benefit... but instead of that kind of test I keep seeing gaming... gaming on HEDT?! WTF
  • pandemonium - Wednesday, November 14, 2018 - link

    All of these "my work doesn't have any desktop users" comments crack me up. Congratulations. Your work is not the entire world of computing in a professional space, much less prosumer space. Get over yourselves.
  • halcyon - Wednesday, November 14, 2018 - link

    @Ian Cutress
    Your tests and review text are always a pleasure to read, thank you for the professionalism.

    Questions related to the test suite (I know, everybody always wants something):

    1. You are missing an Excel spreadsheet calculation (finance still uses a lot of these, and they can peg all cores near 100% and be incredibly CPU dependent). It would be nice to see, for example, an Excel Monte Carlo simulation in the suite (local data).

    2. Alternatively, an R (language) based test for heavy statistical computation. Finding one that is representative of real-world workloads and strikes a balance between single-core IPC and many-core parallelisation might take some work, but this is one area where laptops just can't muster it and CUDA/OpenCL acceleration often isn't available at all. (An illustrative sketch of this kind of workload follows after this list.)

    3. For Web / JS frameworks it is nice to see Speedometer and WebXPRT 3, but for some reason the V8 Web Tooling Bench is not there (https://v8.github.io/web-tooling-benchmark/ ). The old Kraken/JetStream/Octane are nice for reference, but they haven't been very representative of the real world for some time now (hence why they were abandoned).
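
    For illustration only: a minimal Python sketch of the kind of embarrassingly parallel, CPU-bound Monte Carlo workload suggested in points 1 and 2 above. The inputs and path counts are hypothetical, and this is not part of the review's actual test suite.

        # Hypothetical CPU-bound Monte Carlo workload: price a European call
        # option by simulation, spreading the paths across all cores.
        import math
        import random
        from multiprocessing import Pool

        S0, K, R, SIGMA, T = 100.0, 105.0, 0.02, 0.25, 1.0  # made-up inputs

        def simulate(n_paths):
            """Average discounted-payoff numerator over n_paths simulated paths."""
            total = 0.0
            for _ in range(n_paths):
                z = random.gauss(0.0, 1.0)
                s_t = S0 * math.exp((R - 0.5 * SIGMA**2) * T + SIGMA * math.sqrt(T) * z)
                total += max(s_t - K, 0.0)
            return total / n_paths

        if __name__ == "__main__":
            chunks = [250_000] * 16          # 4M paths split across workers
            with Pool() as pool:             # one worker per core by default
                parts = pool.map(simulate, chunks)
            price = math.exp(-R * T) * sum(parts) / len(parts)
            print(f"Estimated option value: {price:.3f}")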

    Again thank you for this monumental work, the amount of tests is already superb!

    For graphing results it would be so helpful to get a comparative price/perf results browser (pick your baseline CPU, pick workloads, CPUs on the graph as a function of price/perf). This would enable quick viewing of the efficient frontier in terms of price/perf for each workload, with the base CPU as an anchor.

    Yeah, yeah, I know.... Just throwing this in here 😀
  • KAlmquist - Wednesday, November 14, 2018 - link

    These benchmarks also show the 16 core TR 2950X beating the 18 core i9-9980XE in some cases.
  • KAlmquist - Wednesday, November 14, 2018 - link

    My previous comment was a reply to nexuspie's observation that, "These benchmarks show that the 9980's 18 cores often BEAT the 2990wx's 32 cores."
  • Stasinek - Wednesday, November 21, 2018 - link

    Which should lead to the conclusion that AMD Threadripper 2 is just a bad offer except for the 2950X.
    It's the one and only AMD CPU worth mentioning, which means TR4 at 16 cores is a dead end.
    AMD offers overpriced CPUs on that platform, that is for sure.
    Overpriced because half of the cores are choking and absolutely useless.
    If 32 cores on 4 channels is too many cores per channel, imagine Ryzen 3 at 16C on dual channel...
    It will be a big disappointment for some people, I bet.
    Regardless of pricing, the Intel 9980 is just great.
  • Stasinek - Wednesday, November 21, 2018 - link

    Is that what you wanted to say?
  • crotach - Wednesday, November 14, 2018 - link

    I have to say I'm a big fan of HEDT platforms, I built my last workstation in 2011 and it still serves me well 7 years later. But looking at this and the X299 offering I really don't see why anyone would bother.
  • Lolimaster - Thursday, November 15, 2018 - link

    Till Intel changes the way it builds high-core-count CPUs, they can't compete with AMD, and it will be even worse next year when AMD makes an already cheaper way of producing high-core-count CPUs even cheaper, to sick levels.
  • Gasaraki88 - Thursday, November 15, 2018 - link

    I'm actually more interested in the i7-9800X vs. the i9-9900K. I want to see how it overclocks compared to the i9-9900K before I jump into X299.
