Conclusion

For anyone buying a new system today, the market is a little bleak. Anyone wanting a new GPU has to actively pay attention to stock levels, or drive to a local store when a delivery arrives. The casual buyers then either look to pre-built systems (which are also flying off the shelves), or just hang on to what they have for another year.

But there is another way. I find that users fall into two camps.

The first camp takes the ‘upgrade everything at once’ attitude. These users sell their old systems and buy almost everything anew. Depending on budget and savings, this is probably a good/average system, and it means you get a good run of what’s available at that time. It’s a multi-year upgrade cycle where you might get something good for that generation, and hopefully everything is balanced.

The other camp takes the ‘upgrade one piece at a time’ approach. This means that if it’s time to upgrade a storage drive, or a memory kit, or a GPU, or a CPU, you get the best you can afford at that time. So you might end up with an older CPU but a top-end GPU, good storage, and a good power supply, and then next time around, it’s all about CPU and motherboard upgrades. This approach has the potential for more bottlenecks, but it means you often get the best of a generation, and each piece holds its resale value better.

In a time where we have limited GPUs available, I can very much see users going all out on the CPU/memory side of the equation, perhaps spending a bit extra on the CPU, while they wait for the graphics market to come back into play. After all, who really wants to pay $1300 for an RTX 3070 right now?

Performance and Analysis

The conclusions from our Core i7-11700K review are broadly applicable here. Intel’s Rocket Lake as a backported processor design has worked, but it has critical issues with efficiency and peak power draw. Compared to the previous generation, clock-for-clock performance gains are 16-22% in math workloads and 6-18% in other workloads; however, the loss of two cores really does restrict how much of a halo product it can be in light of what AMD is offering.

Rocket Lake makes good on offering PCIe 4.0, enables new features like Gear ratios for the memory controller, and pushes for wider 2.5 gigabit Ethernet support; however, it remains a tough sell. At the time we reviewed the Core i7-11700K, we didn’t know the pricing, and AMD’s stock levels were looking pretty bad, which would have made Intel the default choice. Since then, Intel's pricing has turned out reasonable for its performance compared to AMD (except for the Core i9), but AMD’s stock is a lot more bountiful.
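As a rough worked example of what the Gear ratios mean (using round numbers for illustration): DDR4-3200 runs the DRAM at an actual clock of 1600 MHz, so in Gear 1 the memory controller also runs at 1600 MHz for the lowest latency, while in Gear 2 the controller runs at half that, 800 MHz, trading some latency for headroom to scale the DRAM frequency higher.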

For anyone looking at the financials for Intel, the new processor die is 25% larger than before, yet it is not being sold at as big a margin as you might expect. From discussions in the industry, it looks like retailers are getting roughly a 20/80 stock split between Core i9 and Core i7, indicating that Intel is going to be very focused on the Core i7 market around $400-$450. In that space, AMD and Intel both have well-performing products; however, AMD holds a small overall lead and is much more efficient.

However, with the GPU market being so terrible, users could jump an extra $100 and get 50% more AMD cores. When AMD is in stock, Intel’s Rocket Lake is more about the platform than the processor. If I said that the Rocket Lake LGA1200 platform had no upgrade potential for users buying in today, an obvious response might be that neither does AM4, and you’d be correct. However, for any user buying a Core i7-11700K on LGA1200 today, compared to a Ryzen 7 5800X customer on AM4, the latter still has the opportunity to go to 16 cores if needed. Rocket Lake comes across as a dead end in that regard, especially as the next generation is expected to be on a new socket, and supposedly with new memory.

Rocket Lake: Failed Experiment, or Good Attempt?

For Intel, Rocket Lake is a dual-purpose design. On the one hand, it provides Intel with something to put into its desktop processor roadmap while the manufacturing side of the business is still getting sorted. On the other hand, it gives Intel a good marker in the sand for what it means to backport a processor.

Rocket Lake, in the context of backporting, has been a ‘good attempt’ – good enough to at least launch into the market. It does offer performance gains in several key areas, and it does bring AVX-512 to the consumer market, albeit at the expense of power. However, in many of the workloads people actually run today, which aren’t AVX-512 accelerated, there’s more performance to be had from older processors, or from the competition. Rocket Lake also gets you PCIe 4.0, but users might feel that is a small add-on when AMD has PCIe 4.0, lower power, and better general performance for the same price.
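To illustrate why so few of today’s workloads exercise AVX-512, here is a minimal sketch in C of the runtime feature dispatch most software performs (assuming GCC or Clang on x86-64; the messages are placeholders). The AVX-512 branch only executes when the CPU advertises support, so software has to ship such a path deliberately before Rocket Lake’s AVX-512 units ever see use:

    #include <stdio.h>

    int main(void)
    {
        /* Populate the CPU feature flags queried by the builtins below. */
        __builtin_cpu_init();

        /* Typical gating: the wide-vector kernel only runs on CPUs
           that report AVX-512 Foundation support. */
        if (__builtin_cpu_supports("avx512f"))
            puts("AVX-512F supported: wide-vector path in use");
        else
            puts("No AVX-512F: falling back to AVX2/SSE paths");
        return 0;
    }

On Rocket Lake the first branch is taken; on Comet Lake or Zen 3 it is not, which is why AVX-512 gains only appear in software explicitly built with a dispatch path like this.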

Intel’s future is going to be full of processor cores built for multiple process nodes. What makes Rocket Lake different is that when the core was designed for 10nm, it was solely designed for 10nm, and no thought was ever given to a 14nm version. The results in this review show that this sort of backporting doesn’t really work, not to the same level of die size, performance, and profit margin needed to move forward. It was a laudable experiment, but in the future, Intel will need to co-design with multiple process nodes in mind.

Comments

  • blppt - Tuesday, March 30, 2021

    I disagree. I had a 9590 (which shipped WITH a small AIO cooler!) and the thing was shaky at best for stability, easily topping 90C at stock settings.

    Not the mobo's fault either; I had the top-end ASUS CHVF-Z 990FX, which was such a mature chipset it practically had grey hairs.
  • TheinsanegamerN - Wednesday, March 31, 2021

    The 9000 series all had stability issues. Backing off one clock bin or tinkering with voltage would usually fix them.

    Bulldozer didn't have the thermal density issues modern CPUs have. If you had the cooling, it would work. Bulldozer's issue was that the sheer amount of heat being generated would overwhelm many CPU coolers of the time, which were built around the more traditional ~100W power draw of Intel i7s and the ~125-140W of Phenoms. The 200W+ that Bulldozer was pulling was new territory.
  • Oxford Guy - Wednesday, March 31, 2021

    Certain motherboard makers played loose with the VRMs. ASRock in particular was known for its 9000-series-certified boards frying. MSI was also bad. Only a few boards were suited to the 9000 series, and any enthusiast would have skipped the 9000 series in favor of one of the lower-leakage chips, which could be overclocked to the same 4.7 GHz. 5 GHz with Piledriver was not stable, requiring too much voltage. ASUS tried to hide that by under-reporting the voltage used in its flagship board. 4.4 GHz was optimal, 4.5 was okay, and 4.7 was as far as one wanted to go for frequent use. That's with the lower-leakage 'E' parts.

    "The Stilt" said AMD would have sent the 9000 series to the crusher had it not come up with an after-the-fact lower standard for leakage. So, Hruska gets his take spectacularly wrong in his Rocket Lake article. The 9000 series was not aimed at 'the enthusiast faithful'. Those people knew better than to buy a 9000 series chip, even though there were a few astroturfers trying to get people to buy them — like one guy who claimed his was running at 5.1 GHz 24/7.

    It was aimed at people who could be tricked by the 5 GHz number. It was the most cynical cash grab possible. Not only did AMD offer only 4 FPU cores (important for gaming), it offered a CPU that was priced into the stratosphere while having un-fixable single-core performance.

    Piledriver's fatal flaw was its abysmal single-thread performance, not its power consumption. It could have been okay enough with the lower-leakage standard (and a more strict socket standard as Zen 1 had). But, reportedly, the 32nm SOI wasn't very good for some time (Bulldozer and the first generation of Piledriver), so AMD let the AM3+ spec be pretty loose (although not as loose as FM).

    Overclocking Piledriver even to 5 GHz wasn't enough to give it decent single-thread performance.

    I do have to agree that the 9590 was the single worst consumer CPU product ever released. It even edges out the Pentium III that wasn't stable — since that one was actually pulled from the market. Not only was the 9590 100% cynical exploitation of consumer ignorance, it was really bad technologically. Figures that Hruska would praise it.

    (If, though, one lived in Iceland with a solar array backed by an iron-nickel battery complex, the 9590 would have been okay for playing Deserts of Kharak, provided one didn't buy it at its original price.)
  • blppt - Thursday, April 1, 2021

    "Those people knew better than to buy a 9000 series chip, even though there were a few astroturfers trying to get people to buy them — like one guy who claimed his was running at 5.1 GHz 24/7."

    What is especially sad here is that even IF he managed to pump 250-300W into that 9590 to run at 5.1 GHz (all cores), it was probably still slower than a 4790K at stock speeds.
  • Oxford Guy - Saturday, April 3, 2021

    In single core, certainly. However, 2011 is stamped onto the spreaders of Piledriver and it hit the market in 2012. The 4790K hit the market in Q2 2014.

    In 2014, the only FX to consider was the 8320E. Not only was it cheap (at least at MicroCenter), it could run in any AM3+ board without killing it — and could be overclocked better than a 9000 series with anything below nitrogen, due to its much superior leakage.

    The 8320E was the only FX worth anyone’s time. Paired with a UD3P board it could do 4.4 GHz readily and could manage 4.7 with a fast fan angled at the VRM sink. Total cost was very low for the CPU and board from MicroCenter, which is why I recommended that setup to the tightest budget people. But, the bad single core was a problem for frametime consistency.

    AMD should have been publicly tarred and feathered by the tech press for the 9590. All the light mockery wasn’t enough.
  • Spunjji - Friday, April 9, 2021

    Broadly agreed, but I'd note that the 6300 was also reasonable if you were on a painfully low budget. I suggested it to a friend (his alternative was a Sandy Bridge i3) and it lasted as his main gaming system until a year ago. It's now moved on to another friend, who still uses it for games. Those chips have aged surprisingly well, all things considered, though it is probably holding his RX 470 back a little bit.
  • Oxford Guy - Wednesday, March 31, 2021

    • The 9590 posted the highest results in the game Deserts of Kharak, in a dual 980 Ti setup at only 1080 or 1440. And, SLI setups showed competitive 4K scores for many games back then.

    • The overclocker 'The Stilt' said the 9000 series is not the chip to judge the design by, because it has the worst leakage characteristics and would have been sent to the crusher had AMD not decided to create a lower standard after the fact. Instead, the chips that should be used to represent Piledriver are the 'E' series. They have the lowest leakage and can manage the same 4.7 GHz the 9590 uses with much more reasonable (although still non-competitive) demands. The 9000 series was really AMD's gift to Intel, making the bad, ancient Piledriver design look even worse.

    • AMD was a small, cash-strapped company, thanks to Intel's monopoly abuses. When AMD was leading the x86 industry, Intel kept it from getting the profit. So Piledriver, although very bad in a number of ways, will never be as bad as Rocket Lake. The 9000 series is the only exception, though, since it was a purely cynical cash grab by AMD, using '5 GHz' to sucker people.
  • blppt - Thursday, April 1, 2021

    "The 9590 posted the highest results in the game Deserts of Kharak, in a dual 980 Ti setup at only 1080 or 1440. And, SLI setups showed competitive 4K scores for many games back then."

    As I stated, in the (exceedingly rare) case where a game or app can saturate all 8 cores, when the 9590 was in its prime, it could be competitive.

    That almost never happened, especially in games. About the only two I can think of offhand that could do that in the 9590's prime were GTA5 and Company of Heroes 2. And even then, you were using 150+ more watts to get the same or slightly better performance than Intel's high-end quad cores. Along with the required AIO water cooling and a high-end mobo with a beastly VRM setup. As far as I know, only 3 pricey mobos were approved for the 9590: my CHVF-Z, one Gigabyte board, and an ASRock.

    The 9590 was one of the worst CPUs ever. Probably the single worst (special edition) CPU. I had one for years.

    This Rocket Lake, while disappointing, hot, and power-hungry, is consistently competitive in every game versus its direct competitors. The 9590 cannot come close to saying that.
  • Oxford Guy - Saturday, April 3, 2021

    I cite Deserts of Kharak because it’s the only game I’ve seen put the FX ahead of Intel at below 4K.

    Not only would the game need to be able to leverage 8 integer cores without needing more than 4 FPU cores, it would have to be able to saturate a narrow deep pipeline and not rely heavily on single thread IPC. It should also scale with clock and not need the best RAM and L3 performance. RTS is probably the best genre for the Piledriver design.
  • Gondalf - Tuesday, March 30, 2021

    The AMD FX-9590 did not have AVX-512. Very high performance has a cost.
    Try to imagine Zen 3 with AVX-512; it would not be a champion in low power consumption at all.

    If you do not like high power draw, simply disable AVX-512.
