Test Bed and Setup

Per our processor testing policy, we pair each CPU with a premium motherboard suitable for the socket and equip the system with a suitable amount of memory running at the manufacturer's maximum supported frequency, typically at JEDEC subtimings where possible.

Some users are not keen on this policy, noting that the maximum supported frequency is sometimes quite low, that faster memory is available at a similar price, or that JEDEC speeds can hold back performance. While these points are fair, ultimately very few users apply memory profiles (XMP or otherwise), as they require interaction with the BIOS, and most users fall back on JEDEC-supported speeds. This includes home users as well as industry, who might want to shave a cent or two off the cost or stay within the margins set by the manufacturer. Where possible, we will extend our testing to include faster memory modules, either at the same time as the review or at a later date.

Test Setup

| Platform | CPUs | Motherboard | BIOS | Cooler | Memory |
|---|---|---|---|---|---|
| Intel 9th Gen | i9-9900K, i7-9700K, i5-9600K | ASRock Z370 Gaming i7** | P1.70 | TRUE Copper | Crucial Ballistix 4x8 GB DDR4-2666 |
| Intel 8th Gen | i7-8086K, i7-8700K, i5-8600K | ASRock Z370 Gaming i7 | P1.70 | TRUE Copper | Crucial Ballistix 4x8 GB DDR4-2666 |
| Intel 7th Gen | i7-7700K, i5-7600K | GIGABYTE X170 ECC Extreme | F21e | Silverstone* AR10-115XS | G.Skill RipjawsV 2x16 GB DDR4-2400 |
| Intel 6th Gen | i7-6700K, i5-6600K | GIGABYTE X170 ECC Extreme | F21e | Silverstone* AR10-115XS | G.Skill RipjawsV 2x16 GB DDR4-2133 |
| Intel HEDT | i9-7900X, i7-7820X, i7-7800X | ASRock X299 OC Formula | P1.40 | TRUE Copper | Crucial Ballistix 4x8 GB DDR4-2666 |
| AMD 2000 | R7 2700X, R5 2600X, R5 2500X | ASRock X370 Gaming K4 | P4.80 | Wraith Max* | G.Skill SniperX 2x8 GB DDR4-2933 |
| AMD 1000 | R7 1800X | ASRock X370 Gaming K4 | P4.80 | Wraith Max* | G.Skill SniperX 2x8 GB DDR4-2666 |
| AMD TR4 | TR 1920X | ASUS ROG X399 Zenith | 0078 | Enermax Liqtech TR4 | G.Skill FlareX 4x8 GB DDR4-2666 |

GPU: Sapphire RX 460 2GB (CPU Tests), MSI GTX 1080 Gaming 8G (Gaming Tests)
PSU: Corsair AX860i, Corsair AX1200i
SSD: Crucial MX200 1TB
OS: Windows 10 x64 RS3 1709, Spectre and Meltdown patched
*VRM supplemented with SST-FHP141-VF 173 CFM fans
** After initial testing with the ASRock Z370 motherboard, we noted it had a voltage issue with the Core 9th Gen processors. As a result, we moved to the MSI MPG Z390 Gaming Edge AC for our power measurements. Benchmarking seems unaffected.

We must thank the following companies for kindly providing hardware for our multiple test beds. Some of this hardware is not in this test bed specifically, but is used in other testing.

Hardware Providers
  • Sapphire RX 460 Nitro
  • MSI GTX 1080 Gaming X OC
  • Crucial MX200 + MX500 SSDs
  • Corsair AX860i + AX1200i PSUs
  • G.Skill RipjawsV, SniperX, FlareX
  • Crucial Ballistix DDR4
  • Silverstone Coolers
  • Silverstone Fans
274 Comments
  • 3dGfx - Friday, October 19, 2018 - link

    game developers like to build and test on the same machine
  • mr_tawan - Saturday, October 20, 2018 - link

    > game developers like to build and test on the same machine

    Oh, I thought they used remote debugging.
  • 12345 - Wednesday, March 27, 2019 - link

    The only gaming use I can think of for those would be to pass through a GPU each to several VMs.
  • close - Saturday, October 20, 2018 - link

    @Ryan, "There’s no way around it, in almost every scenario it was either top or within variance of being the best processor in every test (except Ashes at 4K). Intel has built the world’s best gaming processor (again)."

    Am I reading the iGPU page wrong? The occasional 100+% handicap does not seem to be "within variance".
  • daxpax - Saturday, October 20, 2018 - link

    If you noticed, the 2700X is faster in half the gaming benchmarks, but they didn't include it.
  • nathanddrews - Friday, October 19, 2018 - link

    That wasn't a negative critique of the review, just the opposite in fact: from the selection of benchmarks you provided, it is EASY to see that given more GPU power, the new Intel chips will clearly outperform AMD most of the time - generally in average, but specifically in minimum frames. From where I'm sitting - 3570K+1080Ti - I think I could save a lot of money by getting a 2600X/2700X OC setup and not miss out on too many fps.
  • philehidiot - Friday, October 19, 2018 - link

    I think anyone with any sense (and the constraints of a budget / missus) would be stupid to buy this CPU for gaming. The sensible thing to do is to buy the AMD chip that provides 99% of the gaming performance for half the price (even better value when you factor in the mobo), and then plough that money into a better GPU, more RAM, and/or a better SSD. The savings from the CPU alone will allow you to invest a useful amount more into ALL of those areas. There are people who do need a chip like this, but they are not gamers. Intel are pushing hard with both the limitations of their tech (see: stupid temperatures) and their marketing BS (see: outright lies) because they know they're currently being held by the short and curlies. My 4-year-old i5 may well score within 90% of these gaming benchmarks because the limitation in gaming these days is the GPU. Sorry, Intel, wrong market to aim at.
  • imaheadcase - Saturday, October 20, 2018 - link

    I like how you said limitations in tech and pointed to temps, like any gamer cares about that. Every gamer wants raw performance, and the fact remains Intel systems are still the easier way to go about it. The reason is simple: most gamers will upgrade from another Intel system and use lots of parts from it that work with current-generation stuff.

    It's like the whole G-Sync vs non-G-Sync thing. It's a stupid argument; it's not a tax on G-Sync when you are buying the best monitor anyway.
  • philehidiot - Saturday, October 20, 2018 - link

    Those limitations affect overclocking and therefore available performance. Which is hardly different to much cheaper chips. You're right about upgrading though.
  • emn13 - Saturday, October 20, 2018 - link

    The AVX-512 numbers look suspicious. Both common sense and other examples online suggest that AVX-512 should improve performance by much less than a factor of 2. Additionally, AVX-512 causes varying amounts of frequency throttling, so you're not going to get the full factor of 2.

    This suggests to me that your baseline is somehow misleading. Are you comparing AVX-512 to ancient SSE? To no vectorization at all? Something's not right there.
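    The throttling point above can be illustrated with back-of-envelope arithmetic. The clock figures below are illustrative assumptions, not measurements from this review; real AVX-512 "license" frequencies vary by chip and workload:

    ```python
    # Sketch: why a clean 2x gain from AVX-512 over AVX2 is unlikely.
    # A 512-bit vector op does twice the work of a 256-bit one, but
    # AVX-512 frequency throttling eats into that ideal factor.
    avx2_width = 256        # bits per vector operation with AVX2
    avx512_width = 512      # bits per vector operation with AVX-512
    avx2_clock = 4.3        # GHz while running AVX2 code (assumed figure)
    avx512_clock = 3.6      # GHz under AVX-512 throttling (assumed figure)

    ideal_speedup = avx512_width / avx2_width                 # 2.0x at equal clocks
    realistic_speedup = ideal_speedup * (avx512_clock / avx2_clock)
    print(f"ideal {ideal_speedup:.1f}x, with throttling ~{realistic_speedup:.2f}x")
    ```

    With these assumed clocks the net gain lands around 1.7x, so a measured 2x or more points to a weaker baseline (e.g. scalar or old SSE code) rather than a pure AVX2-to-AVX-512 comparison.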
