258 Comments


  • jjj - Monday, June 19, 2017 - link

    The 10-core die is clearly 320+ mm², not 308 mm². The 308 mm² figure rounds down the dimensions based on those GamerNexus pics. From there, you slightly underestimate the size of the other two dies. Reply
  • Sarah Terra - Monday, June 19, 2017 - link

    Fair point, but what I take from this review is that you are going to be spending pretty much double the cost of Ryzen or more for a proc with a 30% larger power envelope if you want higher performance. Intel is scrambling here; well done AMD. Reply
  • jjj - Monday, June 19, 2017 - link

    With 8 cores and up, thermals are a big issue when you OC Skylake-X. Power too, to some extent.
    The 6-core looks interesting vs the 7700K, but not so much vs anything else. CPU+mobo gets you north of $600, and that's a lot. If it had all the PCIe lanes enabled there would be that but, while plenty will buy it, it makes no sense to. And of course there should be a Coffee Lake 6-core soon; we'll see how it is priced. In consumer, 6 cores with 2 memory channels is fine.
    More than 6 cores are priced way too high and, if you need many cores, you buy for MT not ST, so ST clocks are less relevant.

    Intel moving in the same direction as AMD on the cache size front is interesting: larger L2 and smaller L3. Now they have "huge cache and memory latency issues" just like Ryzen lol.
    Whatever, Intel's pricing is still far too high and this platform remains of minimal relevance.
    Reply
  • ddriver - Monday, June 19, 2017 - link

    Funny though: when Ryzen under-performed in games, that was no reason not to publish gaming benches; in fact, being the platform's main weakness, there was actually emphasis put on that... but when it comes to intel we gotta have special treatment... Let's hear it for objectivity!

    Granted, the 7800X finally brings something of relatively decent value, but there's still no good reason to justify the purchase unless one insists on an intel product, for the brand, for thunderbolt or hypetane support.

    "To play it safe, invest in the Core i9-7900X today."

    Really? With Threadripper incoming in a matter of weeks? For less than $1000 you will get 16 Zen cores. It will definitely beat the 7900X by a decent margin in terms of performance, plus the massive I/O capabilities and also ECC support, which I'd say is vital. That just doesn't sound like an honest recommendation. Not surprising in the least.
    Reply
  • ddriver - Monday, June 19, 2017 - link

    Also, on top of that, we have launch prices for Ryzen rather than current prices. Looks like a rather open attempt to diminish AMD's platform value. Reply
  • Ian Cutress - Tuesday, June 20, 2017 - link

    We've always posted manufacturer MSRPs in our CPU charts. There has been no official price drop from AMD; if you're seeing lower prices, they're being set at the distributor level.

    On the TR issue, we basically haven't tested it and don't know the price. Lots of variables in the air, which is why the words are /if you want to play it safe/. Safe being the key word there.
    Reply
  • ddriver - Tuesday, June 20, 2017 - link

    Dunno Ian, in my book this sounds more hasty than safe. The safe thing would be to wait it out. Even without the imminent TR launch, early adoption is rather unsafe on its own. As it is, it sounds more like an attempt to dupe people into spending their money on intel on the eve of the launch of a superior value and performance product from a direct (and sole) competitor.

    It is true that nothing is officially known about TR yet, but based on the Ryzen marketing strategy and performance we can make safe and accurate speculations. I expect to see the top TR chip launched at $999, offering at the very least a 30% performance advantage over the 7900X in a similar or slightly higher thermal budget, of course in workloads that scale nicely with core count.

    Comparing the 7900X to the 1800X, we have a ~35% performance advantage for 205% the price and 150% the power usage. Based on that, it is a safe bet that TR is going to shine.
    Reply
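The ratio arithmetic in that comparison can be sanity-checked with a quick sketch; the performance, price, and power multipliers below are the comment's own claims, not measured data:

```python
# Value comparison using the figures claimed above (commenter's numbers, not measurements).
# 7900X vs 1800X: ~1.35x performance, ~2.05x price, ~1.50x power.
perf, price, power = 1.35, 2.05, 1.50

perf_per_dollar = perf / price  # relative perf-per-dollar of the 7900X vs the 1800X
perf_per_watt = perf / power    # relative perf-per-watt

print(f"perf per dollar: {perf_per_dollar:.2f}x")  # ~0.66x: worse value per dollar
print(f"perf per watt:   {perf_per_watt:.2f}x")    # ~0.90x: slightly worse per watt
```

On those numbers the 7900X delivers roughly a third less performance per dollar, which is the commenter's underlying point.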
  • fanofanand - Monday, June 26, 2017 - link

    Ian is a scientist, the less guessing the better. Give him an opportunity to review TR before giving suggestions. Doesn't that seem fair? Reply
  • t.s - Tuesday, June 20, 2017 - link

    Play it safe? Really?? Please. As if everyone in this world's stupid. Reply
  • Ranger1065 - Wednesday, June 21, 2017 - link

    There has never been a better time to give Intel the middle finger. Reply
  • slickr - Tuesday, July 04, 2017 - link

    I've been a long time user here and I can SAFELY say you got paid by Intel. How much did they pay you for this ridiculous review? Reply
  • nevcairiel - Monday, June 19, 2017 - link

    The Ryzen 7 launch review didn't have gaming benchmarks either. Reply
  • ddriver - Monday, June 19, 2017 - link

    That's true, my bad; I didn't remember AT's review in particular, but I remember that in most reviews gaming was like 3/4 of the review... Reply
  • Spunjji - Tuesday, June 20, 2017 - link

    My thoughts exactly. Not bagging on AT specifically here, just review sites in general. A lot of them are giving out TBD on gaming performance with mentions of it being OK at 4K, whereas with Ryzen it was all "but it games badly at 1080p which people spending $500 on a processor will totally be aiming at". Reply
  • bongey - Wednesday, August 02, 2017 - link

    They said it in their conclusion "Gaming Performance, particularly towards 240 Hz gaming, is being questioned,"
    "workstation cpu"
    Reply
  • ash9 - Monday, June 19, 2017 - link

    Totally agree,
    I find it disingenuous of this site and many others that there's an INTENTIONAL overlooking of the fact that the 7900X runs 70W higher (PC Perspective) than the 6950X at load. Any blind man could see Intel boosted the clocks on the 7900X for cosmetic benchmark wins and to make today's lineup look relevant. Take the 7900X out of the benches and the lineup looks anemic. This is the BS that should not go unnoticed.
    Reply
  • Alexvrb - Monday, June 19, 2017 - link

    Reminds me a bit of the pre-Conroe era. Maybe they should have revived the Extreme Edition name... Reply
  • sweetca - Tuesday, June 20, 2017 - link

    Some people actually read the reviews here because they are gathering information for an imminent decision.

    Not everyone wants to wait 3 weeks (maybe delays?), and then to play it safe wait another 3 weeks for the next thing, etc.

    I don't post often, but I was surprised how quickly the writer's integrity and honesty were attacked, considering they were making a subjective evaluation; "safe." I guess this is common now.
    Reply
  • Timoo - Saturday, July 01, 2017 - link

    To be honest: calling the i9 7900X a "safe bet" is not a scientific decision. The platform is far from perfect and the CPU runs hot when OC'd. It has been introduced 2 months in advance of the official release date, to beat TR. To me these 3 facts don't make it a "safe bet", more like a "daring endeavor to save Intel's face".

    So yes, I do understand the attacks, apart from the FanBoy's FlameBaits...
    Reply
  • Flunk - Monday, June 19, 2017 - link

    I'm surprised by how well the $249 Ryzen 5 1600X holds up in those benchmarks. Seems like the processor to go for, for the majority of people. It should keep up in games for years to come. Yes, the top-end stuff is great and all, but it's a <1% product. Reply
  • prisonerX - Monday, June 19, 2017 - link

    Value for money seems to take a back seat to bragging rights for some people. Makes them look silly I think, but they seem to think it makes them look good. Reply
  • asendra - Monday, June 19, 2017 - link

    ?? In a professional setting, being 20-30% or whatever faster is well worth the $500-1000 extra. Sure, it may only make that render 5-10 min faster, but those gains sure add up over the course of a year.
    Gaining tens of hours of productivity over the course of a year is surely worth the extra $.
    Reply
  • Sarah Terra - Monday, June 19, 2017 - link

    So does the power bill. You'll note the "superior" intel procs have a much higher thermal rating. Reply
  • ScottSoapbox - Monday, June 19, 2017 - link

    People spending $999 on a CPU alone aren't worried about an extra few dollars on their power bill. Reply
  • Lolimaster - Tuesday, June 20, 2017 - link

    The thing is, AMD's Threadripper offers much more power for the same price or probably less; intel is not an option for workstation :D Reply
  • Timoo - Saturday, July 01, 2017 - link

    ThreadRipper is not available yet, so it's not an option. Yes, Intel rushed the X299 platform to beat AMD, which makes it a "bad bet", in my opinion. But we simply cannot compare it to TR as of yet. Intel in a workstation is very much an option. Just not one I would take :-) Reply
  • Integr8d - Tuesday, June 20, 2017 - link

    People spending $999 on a CPU to fill 1,000s of blades in a datacenter are definitely worried about a few dollars on their power bill... Reply
  • jospoortvliet - Thursday, June 22, 2017 - link

    Sure, but this CPU is for workstations, not blades. Epyc and Xeon compete in that market. Reply
  • melgross - Monday, June 19, 2017 - link

    Well, since one might expect to make at least tens of thousands on a single machine in a quarter, or more likely a month, for a real business, and considering depreciation, the extra costs are well worth it. In fact, they're negligible. Reply
  • FreckledTrout - Tuesday, June 20, 2017 - link

    If the 16-core Threadripper ends up being more performant, cheaper, and using less power, then this argument won't matter. I'm not sure it will win all three, but it could. Reply
  • Lolimaster - Tuesday, June 20, 2017 - link

    <$1000 for 16-core Zen; spend your money there, worth the extra over the 8-core Ryzen. Reply
  • Flunk - Tuesday, June 20, 2017 - link

    No, it isn't. In a professional setting it's better to have more, less expensive systems rendering/serving/anything that just needs more processor time. For an additional $1000, I can double my performance rather than getting 20-30% more. Reply
  • FMinus - Thursday, June 22, 2017 - link

    Frankly, I was looking at an upcoming 12-core CPU from either AMD or Intel for my render machine. With the EPYC prices announced, I could see myself going with two 12-core AMD Threadrippers if they keep them under $800. I've got most parts, just need motherboards and the chips, and if they really do keep the 12-core TR under $800, I will get two systems for a bit more than the 12-core Intel CPU alone will cost. Granted, since I've got most other parts I spent that ahead, but still: two systems, easily. And of course the power consumption will be higher, but still. Reply
  • barleyguy - Saturday, June 24, 2017 - link

    The 1600X has a 4.1 GHz XFR frequency, which requires good cooling but seems to kick in more than on other Ryzen processors, likely because of two fewer cores. So on lightly threaded tasks without manual overclocking, the 1600X is a great choice.

    Manual overclocking changes the picture a bit though. In that case the 1600 and 1700 move up in bang for the buck, as does the i7 7700k.
    Reply
  • chrysrobyn - Monday, June 19, 2017 - link

    Zen isn't winning anything here, but they're showing up to the party. It's hard to ignore their prices, which are always lower than the Intel chips in the neighborhood (summed up in the conclusion with "Play it cheaper but competitive"), and their power -- 1/3rd less power than the Intel chips nearby -- which I didn't even see addressed? Reply
  • Ian Cutress - Monday, June 19, 2017 - link

    I made the graph, forgot to write about it. Doing so now...

    (I always end up writing through the launch time :D)
    Reply
  • Ian Cutress - Monday, June 19, 2017 - link

    OK sorry, done. I'm currently in another briefing for something else... Reply
  • FreckledTrout - Monday, June 19, 2017 - link

    Missing the 7820x on the power draw graph. Reply
  • Ian Cutress - Tuesday, June 20, 2017 - link

    The 7820X power numbers didn't look right when we tested it. I'm now on the road for two weeks, so we'll update the numbers when I get back. Reply
  • chrysrobyn - Monday, June 19, 2017 - link

    In my head I'm still doing the math on every benchmark and dividing by watts and seeing Zen looking very different. Reply
  • Old_Fogie_Late_Bloomer - Monday, June 19, 2017 - link

    I'm sure I'm wrong about this, but it makes more sense to me that the i9-7900X would be a (significantly) cut down HCC die instead of a perfect LCC. i9 vs i7, 44 vs 28 lanes, two AVX units instead of one?

    And yet the one source I've found so far says it's the smaller die. It's definitely the LCC die, then?
    Reply
  • Ian Cutress - Tuesday, June 20, 2017 - link

    HCC isn't ready, basically. LCC is. Plus, having a 10C LCC die and not posting a top SKU would be wasteful of the smallest die of the set.

    Also, delidding a 10C SKU.
    Reply
  • Old_Fogie_Late_Bloomer - Tuesday, June 20, 2017 - link

    Well, it wouldn't be a waste if Intel's yields weren't good enough to get fully functional dies. The fact that Intel is not just releasing fully functional LCC chips but announced that they would be the first ones available suggests that they have no trouble reliably producing them, which is pretty impressive (though they have had plenty of practice on this process by now).

    Thanks for the response; I thoroughly enjoyed the review and look forward to further coverage. Exciting times!
    Reply
  • Despoiler - Monday, June 19, 2017 - link

    Considering Ryzen is in the desktop category and these Intel chips are HEDT, we need to wait to see what Threadripper brings. AMD won't have the clock advantage, but for multithreaded workloads I suspect they will have more cores at a cheaper price than Intel. Reply
  • FreckledTrout - Monday, June 19, 2017 - link

    I wouldn't say AMD won't have a clock advantage once you get to the 14- and 16-core chips. They might not, but you saw the power numbers and thermals; Intel may very well have to pull back the frequency as they scale up the cores more than AMD will. Reply
  • FMinus - Thursday, June 22, 2017 - link

    Actually I think it's the other way around. AMD might have a clock advantage on higher core-count models thanks to not going with the monolithic approach. Easier to cool those beasts, but power is still an issue.

    If you imagine four 1800Xs on one interposer, you can see them reaching 4GHz on all of those dies; that said, the power consumption would be massive, but easier to cool than the intel 16-core variant.
    Reply
  • Lolimaster - Tuesday, June 20, 2017 - link

    The 1995X will have a stock 3.6GHz for the 16 cores, same as the 7900X with just 10. Reply
  • geekman1024 - Monday, June 19, 2017 - link

    Zen is winning in one department: Price. Reply
  • Lolimaster - Tuesday, June 20, 2017 - link

    Ryzen has sick efficiency at lower clocks; that 65W Ryzen 7 1700 can be undervolted further still and made into a 50W 3GHz monster. Reply
  • sir_tech - Monday, June 19, 2017 - link

    Why there are no power consumption charts in the review? Also, you should have gone ahead and post the gaming performance charts also just like Ryzen reviews.

    While the MSRPs are high, actual retail prices for Ryzen processors are much lower now.

    Ryzen 7 1800x - $439 (MSRP - $499)
    Ryzen 7 1700x - $349 (MSRP - $399)
    Ryzen 7 1700 - $299 (MSRP - $329)
    Ryzen 5 1600x - $229 (MSRP - $249)
    Reply
  • Ryan Smith - Monday, June 19, 2017 - link

    "Why there are no power consumption charts in the review?"

    Please refresh the conclusion.=)

    "Also, you should have gone ahead and post the gaming performance charts also just like Ryzen reviews."

    The BIOS updates have come so late that we don't even have a complete dataset for the new BIOSes. Ian had just enough time to make sure they were still screwy, and then was on a plane. We're going to need to sit down and completely redo all the Skylake-X chips once the platform stabilizes to the point where our results won't be immediately invalidated.
    Reply
  • cheshirster - Monday, June 19, 2017 - link

    Your DDR4-2400 tests of 1800X and 1600X are already invalidated.
    And RoTR
    There was no problem with publishing bad gaming results for AMD.
    What's the problem with 2066?
    Reply
  • Ryan Smith - Monday, June 19, 2017 - link

    If we had a complete, up-to-date dataset to publish, and time to write it up, we would have. If only to showcase why eager gamers should wait for the platform to mature a bit. Reply
  • cheshirster - Monday, June 19, 2017 - link

    Sorry, with this text:
    "Our GTX1080 seems to be hit the hardest out of our four GPUs, as well as Civilization 6, the second Rise of the Tomb Raider test, and Rocket League on all GPUs. As a result, we only posted a minor selection of results, most of which show good parity at 4K"
    + ryzen bad fullhd results in RoTR and Rocket League fully published.

    You are going straight to the Hall of Fame of typical brand loyalists.
    Reply
  • jospoortvliet - Thursday, June 22, 2017 - link

    Well the state of Ryzen wasn't as bad as this and it isn't like it was not pointed out in this review.

    Also I am sure other benchmarks were also affected making Intel look worse in benchmark databases thanks to their rush job...
    Reply
  • bongey - Wednesday, August 02, 2017 - link

    Yep you bashed Ryzen in gaming in your review, quit lying.
    "Gaming Performance, particularly towards 240 Hz gaming, is being questioned,"
    Reply
  • Gasaraki88 - Monday, June 19, 2017 - link

    Everything is on default, no overclocking. Reply
  • AnandTechReader2017 - Monday, June 19, 2017 - link

    And no load/idle? And clockspeed at full load?
    Also, no mention of Intel drawing more than the rated load?
    Reply
  • AnandTechReader2017 - Monday, June 19, 2017 - link

    Sorry, missed the paragraph under on refresh. Reply
  • wolfemane - Monday, June 19, 2017 - link

    Seriously... you don't post gaming results because of BIOS issues? Seems Ryzen had issues due to premature BIOSes as well, but that sure as hell didn't keep you all from posting results anyway.

    I was going to give you guys kudos, as I was reading the article, for excluding gaming results on workstation CPUs. But nope, you had to add that little blurb there at the end.

    Pretty god damn shameful if you ask me. Post the results, then do a follow-up when vendors get their crap together and release reliable BIOSes. The same vendors that blamed AMD for their inability to create halfway decent BIOSes.
    Reply
  • Ryan Smith - Monday, June 19, 2017 - link

    Unfortunately it's a bit of a damned if you do, damned if you don't situation for us. The Skylake-X platform is still quite immature in some respects, and Intel did not give us a ton of lead-time in testing it. The BIOSes only came in a bit before Ian had to get on a plane. So we've been racing the clock for the past week trying to pull things together.

    What we do have are a set of incomplete data that still shows some issues. But we need time to further validate that data, and even more time to write about it. Both of which have been in short supply over the last few days.

    Mentioning gaming at all is because we wanted to point out that there are still issues, and that anyone who is primarily interested in gaming is likely best served by waiting, rather than jumping on Intel's pre-orders.
    Reply
  • wolfemane - Monday, June 19, 2017 - link

    I GET where you are coming from. But you (Anandtech) had no issues posting AMD Ryzen results knowing full well that the x370 platform was far *FAR* from mature. Anandtech and other reviewers didn't hesitate to mention all the bugs possibly holding the platform back. But you still posted the results. As you should have. Reply
  • Ian Cutress - Monday, June 19, 2017 - link

    Err, what? Our Ryzen 7 review did not have gaming benchmarks.
    http://www.anandtech.com/show/11170/the-amd-zen-an...
    Reply
  • wolfemane - Monday, June 19, 2017 - link

    Well... guess how incredibly stupid I feel? If I could retract my comment I would. I'll go reread the launch review again. Reply
  • cheshirster - Monday, June 19, 2017 - link

    No need to apologize.
    Here are gaming tests with DDR4-3000 memory downclocked to 2400,
    with Ryzens BADLY underperforming in RoTR and Rocket League pretty much fully published:
    http://www.anandtech.com/show/11244/the-amd-ryzen-...
    And for Intel they write this
    "Our GTX1080 seems to be hit the hardest out of our four GPUs, as well as Civilization 6, the second Rise of the Tomb Raider test, and Rocket League on all GPUs. As a result, we only posted a minor selection of results, most of which show good parity at 4K"
    Reply
  • DanNeely - Monday, June 19, 2017 - link

    Check the dates; that was published about 5 weeks after the initial Zen review that Ian linked to above. The initial one didn't have any gaming data yet; because in both cases the release day situation was too broken. Reply
  • melgross - Monday, June 19, 2017 - link

    Yeah, it's usually considered to be proper to first read about what you're commenting upon. Reply
  • wolfemane - Monday, June 19, 2017 - link

    First off, comments like yours contribute absolutely nothing, making whatever you say completely useless and more appropriate for deletion than for helping individuals come to conclusions based on what they read. At least they are posting on the topic at hand.

    Second, I read the article, and it was well done. My comments were directed at the very end of their conclusion, and I was basing them on a review that came out a few months after the original Ryzen review. I got my articles mixed up, owned up to my mistake, and apologized.

    What are you doing? Trolling...? How about adding something constructive to the conversation instead of posting utterly pointless and useless drivel? Grow the F up.
    Reply
  • Ryan Smith - Monday, June 19, 2017 - link

    Wolfe, nested comments only display 5 deep. They were responding to cheshirster, not you. =) Reply
  • bongey - Wednesday, August 02, 2017 - link

    Don't be; they hammered Ryzen on gaming performance in their conclusion, even without benchmarks. That is clear evidence of shilling for Intel, following a narrative without any evidence.
    "Gaming Performance, particularly towards 240 Hz gaming, is being questioned,"
    "AMD has a strong workstation core "
    Reply
  • cheshirster - Monday, June 19, 2017 - link

    See here
    http://www.anandtech.com/show/11244/the-amd-ryzen-...
    fullhd
    i5 7600 - 139fps
    1800X - 99fps
    http://www.anandtech.com/show/11244/the-amd-ryzen-...
    Rocket League fullhd
    i5 7500 - 188fps
    1800X - 132fps

    And now they write
    "Our GTX1080 seems to be hit the hardest out of our four GPUs, as well as Civilization 6, the second Rise of the Tomb Raider test, and Rocket League on all GPUs. As a result, we only posted a minor selection of results, most of which show good parity at 4K"

    RoTR and RL, same games, same bad results, just different brands, and now they are not going to publish them.
    Reply
  • Ryan Smith - Monday, June 19, 2017 - link

    It's important to note that the articles you quote are from the Ryzen 5 launch, which was over a month after the X370 platform launch. A lot of Ryzen's issues had been fixed in the intervening weeks. Reply
  • bongey - Wednesday, August 02, 2017 - link

    In your conclusion intel shill
    "Gaming Performance, particularly towards 240 Hz gaming, is being questioned,"
    "AMD has a strong workstation core "
    Reply
  • koomba - Thursday, July 06, 2017 - link

    Uhh, not sure what you are remembering, but AnandTech's initial Ryzen review most certainly did NOT include gaming benchmarks.

    I think it's slightly amusing how many people here in the comments immediately jumped down the reviewer's throat over the lack of gaming benchmarks and the reason given for it. And then they proceed to spin that into some kind of perceived bias against Ryzen, as if the author has some AMD-bashing agenda.

    You, and several others, are literally inventing "facts" to support accusations of bias and unequal treatment. Then to top it off, trying to say Anandtech reviewers are fan boys.

    But in reality, the entire basis of all these claims of bias, etc. is completely fabricated. So much for all that, huh? Almost seems overly defensive; some might even say fanboy behavior. Irony is present. lol.
    Reply
  • bongey - Wednesday, August 02, 2017 - link

    Nope they just bashed Ryzen in gaming in the conclusion even without benchmarks.
    "Gaming Performance, particularly towards 240 Hz gaming, is being questioned,"
    "AMD has a strong workstation core "
    Reply
  • Slappi2 - Monday, June 19, 2017 - link

    Wow AMD gets stomped here. No way I would buy an AMD CPU after seeing that. Reply
  • R0H1T - Monday, June 19, 2017 - link

    Sure now enjoy your 10 core space heater ~
    www.tomshardware.com/reviews/intel-core-i9-7900x-skylake-x,5092-11.html
    Reply
  • Slappi2 - Monday, June 19, 2017 - link

    Running 1080s, doubt I'll notice. Reply
  • Luckz - Monday, June 19, 2017 - link

    That's an impressive review / article considering the source. Reply
  • prophet001 - Monday, June 19, 2017 - link

    The TDP on the chip is 140W. If they can't cool it then there's a problem with the heat spreader.

    How well it overclocks is another point of discussion separate from this one.
    Reply
  • prisonerX - Monday, June 19, 2017 - link

    And yet the only reason you're seeing it at all is because of AMD. You epitomise the pure, untrammeled genius of typical Intel customers. Reply
  • Spunjji - Tuesday, June 20, 2017 - link

    This is lost on Slappi. Reply
  • FreckledTrout - Monday, June 19, 2017 - link

    Slappi2, are you reading the same charts? Also, have you seen game reviews on other sites? The 1800X holds its own pretty well compared to the 8-core 7820X; yes, the 7820X is faster, but not by a huge margin. We don't have Threadripper just yet to see AMD's 10+ core comparison, so saying Intel's 10-core is killing AMD's 8-core models is a bit disingenuous. Reply
  • AnandTechReader2017 - Monday, June 19, 2017 - link

    The review is missing power consumption, clock speed and current price.
    At no point is it stated what clock speed anything is running at; it could be that Ryzen is running at base and Intel running overclocked (as Speed Shift would allow). There are also no comparisons with Speed Shift on/off.

    There is no mention of power draw; the Intel processors all have a 112/130/140W envelope but are probably way below that at base, while Ryzen is 95/65W, with no mention of whether it's overclocked above that envelope or underclocked.

    If you check Amazon, one can get the R7 1800X for $439 versus the $499 they posted.

    Also, the top Intel chips are competing with Thread Ripper, not Ryzen 7.

    You're probably going to accuse me of fanboyism, I just don't like it if so many test details are missing. There is also no mention of temperature at max load, etc.
    Reply
  • AnandTechReader2017 - Monday, June 19, 2017 - link

    To add: they also have 2666MHz and 3000MHz RAM, yet they don't state what clock speed the test is run at.

    There is so much information missing.
    Reply
  • Ian Cutress - Monday, June 19, 2017 - link

    Power consumption is added. Sorry, I'm currently half-way around the world suffering jet lag - I had the graph compiled, I just forgot to write about it. It's been added.

    Clock Speed: As per the first page.
    Current Pricing: As per the first page.

    With regards to the recent price drops from AMD on the Ryzen chips: everything seems to point to this being distributor-driven. We've not seen anything official from AMD (such as price lists) that confirms an official price drop. If you have a link to that, please share.

    On the DRAM: our standard policy, as with every review, is to run JEDEC at the maximum officially supported frequency. Nothing changes in this review. So that's DDR4-2666 for the 7900X and 7820X, and DDR4-2400 for the 7800X.
    Reply
  • Tamz_msc - Monday, June 19, 2017 - link

    What benchmark was used for the power consumption data? Reply
  • Ian Cutress - Monday, June 19, 2017 - link

    Prime95 Reply
  • AnandTechReader2017 - Tuesday, June 20, 2017 - link

    Are you sure the numbers are correct? The i7-6950X on your graph here shows less than the 135W in your original review of it under an all-core load. Reply
  • Ian Cutress - Tuesday, June 20, 2017 - link

    We're running a new test suite, different OSes, updated BIOSes, with different metrics/data gathering (might even be a different CPU, as each one is slightly different). There's going to be some differences, unfortunately. Reply
  • gerz1219 - Monday, June 19, 2017 - link

    Power draw isn't relevant in this space. High-end users who work from a home office can write off part of their electric bill as a business expense. Price/performance isn't even that much of an issue for many users in this space for the same reason -- if you're using the machine to earn a living, a faster machine pays for itself after a matter of weeks. The only thing that matters is performance. I don't understand why so many gamers read reviews for non-gamer parts and apply gamer complaints. Reply
  • demMind - Monday, June 19, 2017 - link

    This kind of response keeps popping up and is highly short-sighted. Price for performance matters at the high end, especially if you use it for your livelihood.

    If you go large-scale, movie rendering studios will definitely be going with what can soften the blow to a large-scale project's budget. This is a FUD response.
    Reply
  • Spunjji - Tuesday, June 20, 2017 - link

    Power efficiency will matter again when Intel leads in it. Been watching the same see-saw on the graphics side with nVidia. They lead in it now, so now it's the most important factor.

    Marketing works, folks.
    Reply
  • JKflipflop98 - Thursday, June 22, 2017 - link

    Ah, AMD fanbots. Always with the insane conspiracy theories. Reply
  • AnandTechReader2017 - Tuesday, June 20, 2017 - link

    Power draw is important, as well as temps; it will allow you to push to higher clocks and cut costs.
    Say your workplace had to get 500 of these machines: if you can use a cheaper PSU, a cheaper CPU and lower power, the savings could be quite extreme. We're talking 95W vs 140W, a ~50% increase versus the Ryzen. That's quite a bit in the long run.

    I run 4 high-end desktops in my household; the power draw saving would be quite advantageous for me. It all depends on circumstances; information is king.

    Ian posted that everything is running at stock speeds. Each version overclocked, with power draw, would also be interesting, as would the difference different RAM clock speeds make (there was a huge fiasco with people claiming nice performance increases from using higher RAM clocks with the Ryzen CPUs; how much is Intel's new line-up influenced? Can we cut costs and spend more on GPU/monitor/keyboard/pretty much anything else?)
    Reply
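As a rough illustration of that fleet argument, here is a back-of-envelope sketch; only the 95W/140W TDPs come from the comment, while the duty cycle and electricity rate are assumptions made up for the example:

```python
# Hypothetical fleet energy delta: 500 machines, 140W vs 95W CPUs at full load.
# Duty cycle (8 h/day, 250 days/year) and rate ($0.12/kWh) are illustrative assumptions.
machines = 500
delta_watts = 140 - 95           # extra CPU power per machine at load
hours_per_year = 8 * 250         # assumed full-load hours per year
rate_per_kwh = 0.12              # assumed electricity price, $/kWh

extra_kwh = machines * delta_watts * hours_per_year / 1000
extra_cost = extra_kwh * rate_per_kwh
print(f"Extra energy: {extra_kwh:.0f} kWh/year (~${extra_cost:.0f}/year)")
```

Under those assumptions the 45W-per-CPU difference works out to tens of thousands of kWh per year across the fleet, before counting PSU inefficiency or cooling.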
  • psychok9 - Sunday, July 23, 2017 - link

    It's scandalous... not one graph about temperature!? I suspect that if it had been an AMD CPU we would have mass hysteria and daily news... >:(
    I'm looking at the i7-7820X and trying to understand whether I can manage with an AIO.
    Reply
  • cknobman - Monday, June 19, 2017 - link

    Nope, this CPU is a turd IMO.
    Intel cheaped out on thermal paste again and this chip heats up big time.
    Only 44 PCIe lanes, shoddy performance, and a rushed launch.

    Only a sucker would buy now before seeing AMD Threadripper, and that is exactly why, and for whom, Intel released these things so quickly.
    Reply
  • Gothmoth - Monday, June 19, 2017 - link

    shoddy performance.. what are you talking about? stupid games?
    bios updates will fix that.

    could not care less about games. but the intels are faster.. no way around it.
    more pricey but faster.
    Reply
  • Flying Aardvark - Monday, June 19, 2017 - link

    You nailed it. Between the temps and power draw, to jump on this lineup is really silly. I like the 1700 for a small air cooled mITX setup. If I moved to anything else I'd dump all this stuff in the middle and go straight to Threadripper.

    If you're going for thread count, do it right and get 16C/32T. Or just stick to a nice cool and quiet R5 or R7.
    Reply
  • cocochanel - Monday, June 19, 2017 - link

    How do they get stomped? AMD's power consumption is about half of Intel's.
    Can you please explain?
    Reply
  • Yongsta - Monday, June 19, 2017 - link

    Wow, comparing $1000+ high end enthusiast desktop parts vs $500 and lower consumer desktop parts. Wait for Threadripper and Ryzen7 right now offers a lot more bang for the buck (if you get the 1700 and overclock it). Reply
  • tarqsharq - Monday, June 19, 2017 - link

    Those multi-threaded benchmarks are going to get really ugly for Intel in a few months I think, especially from a bang for buck perspective. Reply
  • T1beriu - Monday, June 19, 2017 - link

    Wrong name: derba8ur

    Real name: der8auer

    Page 6.
    Reply
  • Ryan Smith - Monday, June 19, 2017 - link

    Thanks! Reply
  • jjj - Monday, June 19, 2017 - link

    A lot of talk about the mesh but not testing it, at least the basic memory BW, latency and scaling.
    No power numbers at all? No OC and temps....
    Why focus on perf and ignore all else when perf is more or less a known quantity and the unanswered questions are elsewhere.

    For Intel you list all Turbo flavors, for AMD you forget XFR when comparing SKUs.
    Reply
  • Luckz - Monday, June 19, 2017 - link

    http://www.tomshardware.com/reviews/intel-core-i9-... has you covered re the mesh Reply
  • jjj - Monday, June 19, 2017 - link

    PCPer tries to look at it too. Reply
  • rascalion - Monday, June 19, 2017 - link

    Are the Ryzen numbers in the charts retests using the last round of bios and software updates? Reply
  • Ian Cutress - Monday, June 19, 2017 - link

    As much as possible, the latest BIOSes are used.
    Our CPU testing suite is locked in for software versions as of March 2017. This is because testing 30/50/100+ CPUs can't be done overnight, we have to have rigid points where versions are locked in. My cycle is usually 12-18 months. (Note I'm only one person doing all this data.)
    Reply
  • FreckledTrout - Monday, June 19, 2017 - link

    Ian, any chance once there are a few BIOS tweaks you could do a mini updated review of the 7820X vs the Ryzen 1800X? With Ryzen having the latest BIOS as well, plus 3200MHz memory. I'm just curious how the 8-core chips line up when some of the dust settles, and I think a lot of people will be. Reply
  • Ian Cutress - Monday, June 19, 2017 - link

    Any reason why 3200? I'll have Intel people saying it is pushing the Ryzen out of spec Reply
  • jjj - Monday, June 19, 2017 - link

    You could do a memory subsystem scaling review for all platforms: Skylake-X, Threadripper, Ryzen (Summit Ridge) and Coffee Lake. Cache, interconnect, DRAM. See where they are, how they scale, where the bottlenecks are, single rank vs dual rank modules, and the perf impact in practice. Why not even the impact on power and efficiency.

    In any case, you'll need to update Ryzen 5 and 7 results when Ryzen 3 arrives, isn't it?

    For DRAM at 3200 it might be out of spec - overclocking the core is out of spec too, but that has never stopped anyone from overclocking the memory. Right now 3200 is what a lot of folks buy, at least for higher end mainstream. Ofc some will argue that Ryzen scales better with memory and that's why it is unfair, but it's a hell of a lot more reasonable than testing 1080p gaming with a 1080 Ti, since it's a popular real world scenario.

    At the end of the day the goal should be to inform, not to watch out for Intel's or AMD's feelings.
    Reply
  • vanilla_gorilla - Monday, June 19, 2017 - link

    >For DRAM at 3200 it might be out of spec - overclocking the core is out of spec too but that has never stopped anyone from overclocking the memory.

    This. Exactly. We're enthusiasts and we always push the envelope. No one cares what the specs are; all we care about is what these processors are capable of in the right hands.

    And Ian I think you guys do an awesome job, there's no other place I look for CPU benchmarks. Keep up what you do, we all appreciate it, as well as your willingness to have a dialog with us about the process. Really cannot say how impressed I am by how open and engaged you are, it's really commendable.
    Reply
  • Ian Cutress - Tuesday, June 20, 2017 - link

    Thanks for the comments :)

    Though on your comments about pushing things out of spec. We have a good deal of readers who want plain stock for their businesses - AT isn't solely a consumer focused site. Otherwise I'd just jack all the CPUs and just post OC results :D Our base testing will always be at stock, and for comparison testing there has to be an element of consistency - testing an OC'ed part against a stock part in a direct A vs B comparison is only going to end up with a barrage of emails being rammed down my throat. There has to be some planning involved.
    Reply
  • Ian Cutress - Tuesday, June 20, 2017 - link

    I've been planning a memory scaling article, I just haven't had the time (this article was around 6 weeks of prep with all the events going on that I had to attend).

    Note we don't retest stuff every review. With our new 2017 test suite, I've been going through regression testing. Usually regression testing is done once for the full segment until the benchmarks are changed again. I'll look at my next few months (still stupidly busy) and look at the priorities here.
    Reply
  • FreckledTrout - Monday, June 19, 2017 - link

    Most people can easily buy a 3200 kit for not a lot of extra money. It doesn't take a lot of tweaking (well, not anymore on AGESA 1.0.0.6) or the silicon lottery like an OC, just a bit more cash. From what I have seen with Ryzen it is the sweet spot on price and performance. I would assume it's likely the most chosen configuration on the R7's. To make it fair, use 3200 on the 7820X as well. I only ask because Ryzen did way better than I would have thought, and I would like to see it with 3200MHz memory and the latest updates to see really how close Intel and AMD are on 8-core systems. Then I'm going to build :) Reply
  • tipoo - Monday, June 19, 2017 - link

    Launch review! Nice work dude(s). Reply
  • Ian Cutress - Monday, June 19, 2017 - link

    Thanks! Reply
  • Cellar Door - Monday, June 19, 2017 - link

    Good review Ian! But would you agree that this launch feels very 'half-baked'?

    I don't think there ever was an Intel launch with this many issues on the platform at start.
    Reply
  • Ryan Smith - Monday, June 19, 2017 - link

    "I don't think there ever was an Intel launch with this many issues on the platform at start."

    Intel always rolls out some new technology on their HEDT processors, which usually invokes teething issues. X99/HSW-E was the first DDR4 platform, and X79/SNB-E launched with some pretty gnarly PCIe 3.0 issues. So X299/SKL-X is fairly similar to past launches, for better or worse.
    Reply
  • Drumsticks - Monday, June 19, 2017 - link

    Still working my way through some of the review, but: "The latest KNL chips use 72 Pentium-class" on the microarchitecture analysis page is wrong. Knights Corner was derived from a Pentium but KNL is derived from Silvermont. Reply
  • Ryan Smith - Monday, June 19, 2017 - link

    Thanks! Reply
  • Drumsticks - Monday, June 19, 2017 - link

    Just finished. Thanks for the review. I look forward to seeing updated gaming benchmarks. Would have loved to see what you had similar to Kaby Lake X showing up here and there, but I understand not wanting to show incomplete data. I trust the editorial integrity of Anandtech more than the comment section, so I'm not crying foul. Can't wait for the follow up! Reply
  • Ian Cutress - Monday, June 19, 2017 - link

    Some Kaby-X data is in Bench, mostly the CPU stuff. I need to replace my Kaby i7 that failed. Reply
  • Drumsticks - Monday, June 19, 2017 - link

    Will take a look then, thank you! Reply
  • Vanquished - Monday, June 19, 2017 - link

    Never visiting your site again. You make excuses for Skylake-X crappy gaming performance but were more than happy to bash AMD for the same issue.

    What a load of crap, shame on you.
    Reply
  • Slappi2 - Monday, June 19, 2017 - link

    Wow. RageQuit because you don't like the truth. It's a CPU! Reply
  • Ian Cutress - Monday, June 19, 2017 - link

    We didn't post gaming data in our launch Ryzen 7 review for the same reason. You are applying double standards.

    http://www.anandtech.com/show/11170/the-amd-zen-an...
    Reply
  • melgross - Monday, June 19, 2017 - link

    Man, another guy who didn't actually read the article, but reads other posters' remarks who also didn't read the article. Can't we just delete these jerks' remarks? Reply
  • koomba - Thursday, July 06, 2017 - link

    Please go back and tag my reply on page 7. Short version: you are wrong, they did NOT do gaming benchmarks on their launch Ryzen review either.

    So quit whining about something that DIDN'T HAPPEN and using it as a weak excuse to bash this site. Your blatant AMD fan boy agenda is pathetic.
    Reply
  • nicolaim - Monday, June 19, 2017 - link

    Typos. Second table on first page says i7 instead of i9. Reply
  • nicolaim - Monday, June 19, 2017 - link

    And incorrect MSRP for Ryzen 7 1800X. Reply
  • Ryan Smith - Monday, June 19, 2017 - link

    Please be sure to reload. Both of those issues on the first page were corrected some time ago. Reply
  • Bulat Ziganshin - Monday, June 19, 2017 - link

    I predicted the details of the AVX-512 implementation 1.5 years ago, when the SKL-S microarchitecture was described in the Intel optimization manual. The details are here:
    http://www.agner.org/optimize/blog/read.php?i=415#...
    Reply
  • Einy0 - Monday, June 19, 2017 - link

    Very disappointed that AT did not publish game benchmarks because they didn't show Intel in the best light but had no problem making a big deal about Ryzen's gaming issues. This isn't the brand of journalism that Anand built this site on. It's certainly not what attracted me to the site and has had me coming back for 20 years. I come for unbiased straight shooting PC technology reviews. Now we get a mobile focus and biased PC hardware reviews. Not to mention the full screen popup ads and annoying hover ads that refuse to go away. How far the mighty have fallen! Reply
  • Ian Cutress - Monday, June 19, 2017 - link

    We never posted Ryzen 7 gaming benchmarks in our launch review for the same reason. Please go back and check:

    http://www.anandtech.com/show/11170/the-amd-zen-an...
    Reply
  • melgross - Monday, June 19, 2017 - link

    You know, it almost doesn't pay to respond to these guys. They're AMD fanboys who are too lazy to read the article first, and they won't read your link either, because they don't want to. They want to believe what they say, no matter what. Reply
  • Einy0 - Monday, June 19, 2017 - link

    Ian, I stand corrected and apologize. I think I allowed a previous poster's anger affect my thoughts on the subject. I'm still very confused as to why you would not publish results on both platforms while including a note in regards to gaming performance. Not that these chips are for gaming but many of us use our PCs as an all purpose computing platform and gaming is frequently included in the mix. Reply
  • Ryan Smith - Monday, June 19, 2017 - link

    "I'm still very confused as to why you would not publish results on both platforms while including a note in regards to gaming performance. "

    1) Lack of time. The most recent BIOS update came close to the launch, and we haven't yet had enough time to fully validate all of our data.

    2) Right now gaming performance is all over the place. And with Intel doing pre-orders, by the time you got your chips there's a good chance there will be another BIOS revision that significantly alters gaming performance.
    Reply
  • Gothmoth - Tuesday, June 20, 2017 - link

    the whole article feels rushed to be honest.

    there is basically no talk about the insane power draw and temps when overclocked.
    Reply
  • jardows2 - Monday, June 19, 2017 - link

    Let me give you the conclusion ahead of time. If you are buying a gaming chip, buy the i7-7700K.

    These are not gaming chips, they are work chips that can do gaming. It's like buying a 1-ton diesel truck, and wanting to floor it at the stoplight. It'll do it, but a Mustang or Camaro will do that better. The truck will be able to haul pretty much anything, that the pony cars will blow their transmissions on.

    Ryzen, on the other hand, is all AMD has, and so gaming results are very relevant to the discussion.
    Reply
  • prophet001 - Monday, June 19, 2017 - link

    ^ This Reply
  • Hurr Durr - Monday, June 19, 2017 - link

    I'd rather wait for the next iteration, whichever Lake that is. Hopefully 6 cores will step down into the mainstream, and then there is 10nm. Reply
  • koomba - Thursday, July 06, 2017 - link

    How.many.times.does.it.have.to.be.said? They did NOT post gaming benchmarks on their first Ryzen review either! You are seriously at least the 10th person who has come on here spouting this COMPLETE falsehood, and using it to bash this site or claim some kind of bias.

    Please do some research before you just talk nonsense and base your entire argument around something that isn't true.
    Reply
  • Gasaraki88 - Monday, June 19, 2017 - link

    Thank you for this article. I knew I could count on Anandtech to write a detailed article on the new Intel CPUs and how everything on them works, like the caches and Turbo Boost Max 3.0, etc., not just benches. Reply
  • marcis_mk - Monday, June 19, 2017 - link

    Ryzen R7 1700 has 24 pci-e lanes (20 for PCI-E dGPU and 4 for storage) Reply
  • Ian Cutress - Tuesday, June 20, 2017 - link

    AMD has a tendency to quote the sum of all PCIe. We specifically state the PCIe root complex based lanes for GPUs. Ryzen has 16 + 4 + 4 - root complex, chipset, IO. Threadripper has 60 + 4: root complex(es) and chipset. Skylake-S has 16 + 4 - root complex, DMI/chipset. Etc. Reply
  • halcyon - Monday, June 19, 2017 - link

    Thank you for the review. The AVX-512 situation was a bit unclear: which of the now-released chips (and the future, yet-to-be-released chips) have which level of AVX-512 support? It would be interesting to see an AVX-512-specific article once the chips are out and we (hopefully) have some useful AVX-512-optimized software (like encoding).

    For myself, the decision is easy now: postpone until Threadripper is out. The thermals are just out of whack and, for my workloads, the price is not justified. Here's hoping Threadripper can deliver more for the same price or less.
    Reply
  • Bulat Ziganshin - Monday, June 19, 2017 - link

    All SKL-X has AVX-512 support. The i9 CPUs have dual FMA-512 engines, but as I guess that's the only difference, i.e. the remaining 512-bit instructions (such as integer operations) will have the same throughput on i7 and i9.
    But I may be wrong, and it will be really interesting to check the throughput of all the other operations. Probably Anand can't do that and we will need to wait until Agner Fog gets his hands on these CPUs.
    Reply
  • halcyon - Monday, June 19, 2017 - link

    Thanks. Many questions remain: AVX512F vs AVX512BW, which are supported now, which in the future? What is the difference? How does it compare to Knights Landing? Are number of AVX units tied to number of cores? What is the speed differential in AVX512 loads? What is the oc limitation from AVX512 support? etc. Reply
  • nevcairiel - Monday, June 19, 2017 - link

    Those Knights Landing/Knights Mill specific AVX512 instructions are actually very specific to the work-loads you would see on such a specialized CPU. The instructions chosen for Skylake-X and future Cannon Lake more closely match what we already know from AVX/AVX2.

    For the "types" of AVX512, basically it's split into several sub-instruction sets, all containing different instructions. AVX512 will always include F, because that's the basis for everything (instruction encoding, 512-bit registers, etc). BW/DQ include the basic instructions we know from AVX/AVX2, just for 512-bit registers. SKL-X supports all of F, CD, BW, DQ, VL.

    Wikipedia on AVX-512 has some more info on the different feature sets of AVX-512:
    https://en.wikipedia.org/wiki/AVX-512

    It's generally safe to just ignore the Knights Landing specific instructions. They are very specific to the workloads on those systems. The AVX-512 subset used for Xeon "Purley" and SKL-X is more in line with the AVX/AVX2 instructions we had before - just bigger.

    For software, x264 for example already got some AVX512 optimizations over the recent weeks, it might be interesting to test how much that helps once all the launch dust settles.
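    As a rough sketch of that subset split: CPUID leaf 7 (sub-leaf 0) reports the AVX-512 groups as individual EBX bits. The bit positions below are taken from Intel's public CPUID documentation; the snippet decodes a supplied raw value rather than executing CPUID, so treat it as illustrative only.

```python
# Decode AVX-512 subset flags from a CPUID leaf 7 (sub-leaf 0) EBX value.
# Bit positions per Intel's CPUID documentation; illustrative sketch only.
AVX512_EBX_BITS = {
    "AVX512F": 16,   # Foundation - the base everything else depends on
    "AVX512DQ": 17,  # doubleword/quadword integer ops
    "AVX512CD": 28,  # conflict detection
    "AVX512BW": 30,  # byte/word integer ops
    "AVX512VL": 31,  # 128/256-bit encodings of AVX-512 instructions
}

def decode_avx512(ebx):
    """Return the AVX-512 subsets indicated by a raw leaf-7 EBX value."""
    return [name for name, bit in AVX512_EBX_BITS.items() if ebx & (1 << bit)]

# A Skylake-X-like feature word: F, DQ, CD, BW and VL all set.
sklx_ebx = sum(1 << bit for bit in AVX512_EBX_BITS.values())
print(decode_avx512(sklx_ebx))
print(decode_avx512(0))  # no AVX-512 at all -> []
```

    (The Knights Landing-only groups such as PF and ER live in other feature bits and are deliberately left out of this sketch.)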
    Reply
  • halcyon - Tuesday, June 20, 2017 - link

    Thank you very much! Reply
  • satai - Monday, June 19, 2017 - link

    Any idea, why is Intel so much better at Chrome compilation? Reply
  • IanHagen - Monday, June 19, 2017 - link

    I'd like to know that as well. Ryzen does particularly fine compiling the Linux Kernel, for example, as seen in: http://www.phoronix.com/scan.php?page=article&... Reply
  • tamalero - Monday, June 19, 2017 - link

    Optimizations?
    I still remember when Intel actively paid some software developers to block multicore and threading on AMD chips to boast about "more performance" during the athlon X2 days.
    Reply
  • johnp_ - Tuesday, June 20, 2017 - link

    It's just the enterprise parts. Kaby Lake S is behind Ryzen:
    http://www.anandtech.com/show/11244/the-amd-ryzen-...
    Reply
  • Gothmoth - Monday, June 19, 2017 - link

    if power draw and heat were not so crazy i would buy the 8 core today.

    but more than the price, these heat issues are a concern for me.. i like my systems to be quiet....
    Reply
  • zlandar - Monday, June 19, 2017 - link

    Glad AMD is lighting a fire under Intel's complacent butt. Reply
  • SaolDan - Monday, June 19, 2017 - link

    Can these chips be overclocked? If so can we get some charts? This is my go to site for tech. Reply
  • Despoiler - Monday, June 19, 2017 - link

    According to Tom's these things are dogs. AIOs can only handle stock frequencies and Prime95 runs.

    http://www.tomshardware.com/reviews/intel-core-i9-...
    Reply
  • Archie2085 - Monday, June 19, 2017 - link

    @ Ryan @ Ian

    How Come no one is talking about the Power Draw or Performance Per Watt per $$. If that is the Metric Looks Like AMD has a winner achieving 80% Results with a 8 Core Processor Against 10 Core Giants Not to mention at a Fraction of Cost

    The Release under such a tearing Hurry Definitely looks like a Knee Jerk Reaction of Zen and Impending Release of Thread Ripper...

    I would Expect you to Give a proper review of Thread ripper just as you have not Made any Meaningful mention of it in this article even though Performance leaks have started...

    Looks More Like a Marketing Article than a Review of Pros and Cons.
    Reply
  • BrokenCrayons - Monday, June 19, 2017 - link

    What's with the odd capitalizations? Reply
  • FreckledTrout - Monday, June 19, 2017 - link

    I'm not sure how much performance per watt comes into play here, but it should at least somewhat. Mostly in the workstation and desktop market it's price vs performance. For data center CPUs it is all about performance per watt, and Intel is in for some trouble if this is a trend that continues into the data center parts. AMD may really have some pretty good CPUs in Threadripper and EPYC. Reply
  • Archie2085 - Monday, June 19, 2017 - link

    HEDT is practically a workstation platform. For freelancers, cost vs extracted value should have a bearing on the choice. Reply
  • FreckledTrout - Monday, June 19, 2017 - link

    No argument, that is what I was hinting at when I said price vs performance. Perf vs power draw isn't top of most workstation users' lists; it is mostly "how much do I pay for this nn percent improvement in rendering times", for example. Performance per watt matters to everyone, but to a much larger degree in data centers. Reply
  • GeorgeH - Monday, June 19, 2017 - link

    Thanks for the review. I generally enjoy your perspective, and try to remember to support you by using a browser with no ad blocking, but have to say your website is almost unusable using Microsoft Edge because of the ads (on my ultrabook with an i5-5300u). I had to switch to an ad blocking browser mid stream to actually read your content, which isn't good for anyone. Reply
  • Silma - Monday, June 19, 2017 - link

    I was disappointed to find no information on overclockability, or did I read too fast?
    I would expect lesser overclockability since the frequency & TDP are already pushed.
    I'd be glad to be proven wrong.
    Reply
  • Ian Cutress - Monday, June 19, 2017 - link

    Didn't get a chance to do overclocking. Testing 5 chips rather than one chip in less than a week (with BIOS issues) means I haven't slept much, and now I'm at a different event half way around the world. Reply
  • Ian Cutress - Monday, June 19, 2017 - link

    I should add I have some 5 GHz numbers on Kaby i7. I need to find time to write but I'm fully booked today :( Reply
  • FreckledTrout - Monday, June 19, 2017 - link

    Cool! (Pun intended) Reply
  • AnandTechReader2017 - Tuesday, June 20, 2017 - link

    Could you also test with Speedshift on/off? Would be interesting how much of an impact it has. Reply
  • lefty2 - Monday, June 19, 2017 - link

    One thing that is never covered by any of these reviews is the efficiency of the CPU. If you measure the performance of a benchmark, then divide by the power used in said benchmark, you will see the most efficient CPU by far is the R7 1700. All Intel Skylake-X and Kaby Lake CPUs are far less efficient (as is the R7 1800X, for that matter). Reply
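    The metric described in the comment above is easy to make concrete. A minimal sketch, using made-up placeholder numbers rather than any measured results:

```python
# Efficiency = benchmark score / package power during that benchmark.
# All numbers here are made-up placeholders, not measured results.
def perf_per_watt(score, watts):
    return score / watts

chips = {
    # name: (hypothetical benchmark score, hypothetical load power in W)
    "65W-class 8-core": (1400, 90),
    "140W-class 10-core": (1800, 160),
}

for name, (score, watts) in chips.items():
    print(f"{name}: {perf_per_watt(score, watts):.2f} points/W")
```

    The point of dividing rather than eyeballing two separate charts is that a part can win raw performance while losing efficiency, as with the placeholder numbers above.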
  • Archie2085 - Monday, June 19, 2017 - link

    @BrokenCrayons

    Oops .. I did not realise how odd it looked.

    On a different note, Tom's Hardware, a sister site, has done a balanced review including gaming benches. Still looks like a rushed product :)

    Intel has to work on that god awful "TIM"
    Reply
  • AnandTechReader2017 - Tuesday, June 20, 2017 - link

    Wouldn't trust Tom's Hardware since that fiasco where only Intel chips were recommended for everything, when the R7 1700 was clearly a better choice. Reply
  • lordken - Wednesday, July 19, 2017 - link

    Though they do include Ryzen as of the next update to their CPU recommendation guide. They probably re-evaluated after that shitstorm in their comments :)
    btw, how does that fit into your perspective? If they are Intel biased, how come they beat the i9 up quite nicely?

    Though I was also surprised that AT didn't talk more about the power and heat issues, as I read Tom's article first http://www.tomshardware.com/reviews/-intel-skylake... and had some good lols, especially after reading this fabulous line:
    "Ultimately, we’re looking at power consumption numbers similar to some high-end graphics cards when we start messing with Skylake-X. AMD’s FX-9590X doesn’t even come close to these results"
    I came here and was surprised to not read anything about it, as the thermals and heat look to be pretty tragic...
    If AMD had come up with such a space heater it would be all over the place... but since it's Intel it seems to be no issue.
    Reply
  • Kevin G - Monday, June 19, 2017 - link

    I would hesitate to say that the memory controller has to be the same size as a CPU core for tiling purposes. Intel could easily produce a double-wide, half-height memory controller and place them at the ends of a column. Intel could also put memory controllers on two routers to remove a hop or two and cut down on-die latencies. Ditto for coherency links and IO controllers. They don't have to be rectangular and the same size as a core for optimal placement.

    In your mock-up of the 5x6 arrangement, there is a lot of wasted space that could be negated if Intel were to re-arrange the dimensions of the IO and memory controllers a bit. Your estimate of 677 mm^2 is spot on with the assumptions you've made, but there is incredible pressure to reduce such huge dies to make them easier to manufacture. There is simplicity in keeping IO and memory controllers the same size as a core for rapid construction of the entire die, but I think the trade-off would favor a smaller die size here.
    Reply
  • Communism - Monday, June 19, 2017 - link

    If you are going to post "power consumption" and "power efficiency" graphs/analysis, you need to post the performance from the "power consumption" test itself, or else it's pretty pointless when comparing vastly different CPUs.

    You mentioned you tested power consumption with prime95.

    I'd wager the intel has a hilariously high performance per watt in prime95 as it would likely be using avx2 instructions (and the massive memory bandwidth of the cache as well as the massive memory bandwidth of the main memory/IMC combination if you are using larger dataset option).

    I wonder how many hundreds of pages of AMD shill posts that this post will be buried under within hours :P

    Oh well, any actual readers should wait for a DigitalFoundry review whenever that happens to come out if you want useful game testing results anyways.
    Reply
  • Gothmoth - Monday, June 19, 2017 - link

    i don't care about power draw that much if i can COOL the CPU and keep the cooling quiet.

    but in this case the power draw is high and the heat is crazy.

    and all because of intel insisting on saving a few dollars on a 1000 dollar CPU and using TIM?

    WTF....
    Reply
  • Ej24 - Monday, June 19, 2017 - link

    I wish AMD had released Threadripper closer to Ryzen. That way AMD wouldn't invite comparisons of Ryzen to Intel X99/X299. They kind of shot themselves in the foot. AM4 is only directly comparable to LGA115x as a platform. R3, 5 and 7 are only really intended to compete with i3, 5, and 7 consumer parts. AMD simply doubled the core count per dollar at the consumer line. It's merely coincidental at this point that Ryzen core counts line up with Intel HEDT. The platforms are not comparable in use case or intent. All these comparisons will be null when Threadripper/X399 is released, as that is AMD's answer to X299. Reply
  • Ej24 - Monday, June 19, 2017 - link

    how is the 7740x, 112w tdp only drawing 80w at full load? I understand that tdp isn't power draw but thermal dissipation. However the two values are usually quite close. In my experience, max turbo power consumption surpasses the tdp rating in watts.
    For example, my 88w tdp 4790k consumes 130w at max 4 core turbo. My 4790S a 65w tdp consumes 80w at max 4 core turbo. My 4790t, 45w tdp, consumes 55w at max 4 core turbo. So how is it the 7740x consumed 80W at max utilization??
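    Putting those quoted figures side by side makes the anomaly obvious (a sketch using only the numbers from this comment):

```python
# Ratio of measured max-turbo draw to rated TDP, using the figures above.
samples = {
    # name: (rated TDP in W, measured max-turbo draw in W)
    "i7-4790K": (88, 130),
    "i7-4790S": (65, 80),
    "i7-4790T": (45, 55),
    "i7-7740X": (112, 80),
}

for name, (tdp, measured) in samples.items():
    print(f"{name}: {measured / tdp:.2f}x TDP")
```

    The three Haswell parts all land above 1.0x, while the 7740X figure comes in around 0.71x, which is exactly why that reading looks odd.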
    Reply
  • AnandTechReader2017 - Tuesday, June 20, 2017 - link

    Agreed, as on http://www.anandtech.com/show/10337/the-intel-broa... the all-core load for the i7 6950X is 135W, yet on this graph it's 110W. Something is wrong with those load numbers. Reply
  • Ian Cutress - Tuesday, June 20, 2017 - link

    It's consumer silicon running a single notch up the voltage/frequency curve. Probably binned a bit better too. 112W is just a guide to make sure you put a stonking great big cooler on it. But given the efficiency we saw with Kaby Lake-S processors to begin with, it's not that ludicrous. Reply
  • Flying Aardvark - Monday, June 19, 2017 - link

    This is an interesting time (finally), again in CPUs. To answer the question you posed, "Ultimately a user can decide the following". I decided to go mini-ITX this time. Chose Ryzen for this, and initially the 1800X. Had to downgrade to the 1700 due to heat/temps, but overall I don't think anything competes against AMD at all in the Node202 today.

    That's one area where Intel is MIA. Coffeelake will be 6C/12T, 7700K is 4C/8T. R7-1700 is 65W and 8C/16T. Works great. I paired mine with a 1TB 960 Pro and Geforce 1060 Founders Edition.

    If I moved to anything else, it would be all the way to 16C/32T Threadripper. I'm really unimpressed by this new Intel lineup, power consumption and heat are simply out of control. Dead on arrival.
    Reply
  • Gothmoth - Monday, June 19, 2017 - link

    what mobo and ram did you use? is your ryzen build really stable?

    i need full load stability 24/7.
    Reply
  • Flying Aardvark - Monday, June 19, 2017 - link

    What, you don't need just 60% stability? Yes it's stable.

    I did have one bluescreen and it was the Nvidia driver. I think it's unlikely most people would run into whatever caused it, because I use a triple monitor setup and lots of programs / input switching, and it crashed upon a DisplayPort redetection.

    I bought the Geforce 1060 because it was the most efficient and well-built blower fan cooled GPU I could find. But buying again, I'd go for the best Radeon 480/580 that I could find.

    I never had a bluescreen for a decade running Intel CPUs and AMD GPUs, so I dislike changing to AMD CPUs and Nvidia GPUs.. but I think it's safest to run a Radeon. Just less likely to have an issue IMO.
    Other than that, no problems at all. Rock solid stable. I used the Biostar board and G.Skill "Ryzen" RAM kit.
    Reply
  • Gothmoth - Tuesday, June 20, 2017 - link

    it's something different if a system is stable for 2-3 hours under load versus 24/7 under load.. capiche? :-) Reply
  • Gothmoth - Tuesday, June 20, 2017 - link

    btw... thanks for your answer.

    i use a triple monitor setup and use many programs at once... what sense would an 8-10 core make otherwise. :-)
    Reply
  • AnandTechReader2017 - Tuesday, June 20, 2017 - link

    The Chill feature for AMD would probably be amazing for a mini-ITX build. Still waiting on AMD to launch dedicated graphics cards for laptops with it, would be amazing.

    I hope Nvidia comes up with a similar feature, would make gaming on laptops a lot nicer/quieter.
    Reply
  • rocky12345 - Monday, June 19, 2017 - link

    "As the first new serious entry into the HEDT space for AMD in almost five years, along with a new x86 core, AMD offered similar-ish performance to Broadwell-E in many aspects (within a few percent), but at half the price or better."

    Half right - it was AMD's first serious entry into the High End Mainstream Desktop in five years. We have not seen AMD's HEDT platform in action yet, since it has not been released. With that said, I was surprised how well AMD's current mainstream R7's were able to compete with Intel's new HEDT platform. If this is what we are to expect from AMD's actual HEDT platform, Intel will have a fight on their hands for sure.

    I hope when AMD releases their Threadripper platform we get just as extensive a review as well. To me it is a lot more exciting to see X399 and Threadripper in action than X299, since Intel has been doing their HEDT platform for many years now, and this will be the first time AMD has entered the same extreme high-end space as Intel at the consumer hardware level.
    Reply
  • Manch - Tuesday, June 20, 2017 - link

    HEMD? LOL

    Ryzen is a great chip no doubt. AMD is once again pushing Intel to iterate vs stagnate. AMD offers a great price/perf proc. But, AMD brought the comparison on themselves. Most previews were not against a 6700K or 7700K but against Intel's HEDT procs which AMD was clearly targeting with the pre-release benchmark comparos. It is not unfair to compare them.
    Reply
  • Braincruser - Monday, June 19, 2017 - link

    Any particular reason the mainstream intel processors are not in the benchmarks? One of the more important measurements for me is the difference between the mainstream and high end platforms. Reply
  • Ryan Smith - Monday, June 19, 2017 - link

    While we technically have infinite space, we try not to overload the graphs with too many products, focusing on new products and certain generational comparisons. For specific comparisons you'd like to see that aren't in a graph, you can find all of that over in Bench.

    http://www.anandtech.com/bench/
    Reply
  • AnandTechReader2017 - Tuesday, June 20, 2017 - link

    Thanks for the link, I didn't know you guys had that. Reply
  • none12345 - Monday, June 19, 2017 - link

    Small correction for your article: Ryzen 7 has 20 PCIe lanes, not 16. I am NOT counting 4 more for the chipset; if you count those it's 24. You've got 16 to the GPU, and 4 to a direct-connected M.2 (or 2 SATA ports, but usually it's an M.2).

    Though that's not the whole story, since technically you don't need the chipset, so you could connect something else instead on those 4 lanes. In the real world though, almost every motherboard will have a chipset there.

    And then there are the USB ports, which have direct connections on the chip as well. Those could be counted as more PCIe lanes if one wanted to. They don't share the chipset link.

    So depending on how you count it, it's 20, 24, or 28 lanes. I would call it 20 and not count the USB or the chipset ones, but I'd definitely count the direct-connected M.2
    Reply
  • Nintendo Maniac 64 - Monday, June 19, 2017 - link

    Uhhh, where's the results for Dolphin? It's on the "clock-for-clock" page but not on the "CPU system tests" page... Reply
  • Ro_Ja - Monday, June 19, 2017 - link

    It's nice to see the Ryzen Chips keeping up with i9s. Reply
  • Hurr Durr - Tuesday, June 20, 2017 - link

    Except they don`t, hence the price difference. Reply
  • Icehawk - Monday, June 19, 2017 - link

    I'm curious to see AMD's response with Threadripper, but the 7820X is looking like the next CPU for me. I'm still on a 3770K, and while it is just fine for gaming, I've been converting all of my media to x265 and need a lot more muscle to speed that along. I do wish that heat was better controlled, as my current system is near silent with a fanless PSU, an AiO water cooler, no mech drives, etc., and overclocked to 4.4GHz. Reply
  • Pieter123456 - Monday, June 19, 2017 - link

    Funny how you did not hold off on making a verdict when there were BIOS and 1080p gaming issues with the Ryzen launch??! Reply
  • Flying Aardvark - Monday, June 19, 2017 - link

    Intel pays well for such courtesies. AMD, not so much. I heard people saying Ryzen was released defective.
    So the conclusion that AT should be making is that the new i7 and i9 lineup was released as defective.
    Reply
  • tamalero - Monday, June 19, 2017 - link

    I hear you. I still remember how fanboys and some "socialites" of the hardware sites bashed AMD nonstop on other issues, like the voltage issues on the Polaris chips.

    Nvidia has problems too, and they aren't even close to being that harsh.
    Ryzen's cache issue? THE WORLD IS ENDING FOR AMD!!!
    Intel's drop in performance core per core in some things? "not that bad, balance, etc..etc.."
    Reply
  • Gothmoth - Tuesday, June 20, 2017 - link

    the biggest issue is this insane heat skylake-x produces when overclocked.

    the bios issues can be resolved.
    that anandtech is more biased towards intel should come as no surprise.
    Reply
  • tamalero - Tuesday, June 20, 2017 - link

    I noticed that; there is a huge difference in the wording of this article vs the one on Tom's Hardware.

    And both show very different outcomes.
    Especially in overclocking and power consumption, where the Core i9 is ridiculously inefficient.
    Reply
  • Luckz - Tuesday, June 20, 2017 - link

    "there is a huge difference in the wording"

    And knowing Tomshardware from decades ago, I would have expected the exact opposite.
    Reply
  • Ryan Smith - Tuesday, June 20, 2017 - link

    To be clear here, our Ryzen article didn't have any gaming coverage either.

    http://www.anandtech.com/show/11170/the-amd-zen-an...

    That launch was based solely on desktop usage & compute, in big part because there were so many weird things going on with gaming. In this article we actually went one step further by specifically recommending that gamers not buy SKL-X for the time being.
    Reply
  • Gothmoth - Tuesday, June 20, 2017 - link

    but no really critical word about the crazy temps.

    how am i supposed to cool this cpu on air?

    and no real criticism that intel uses stupid thermal paste..... overall the critical stuff like crippled PCIe lanes needs to be addressed more aggressively.
    Reply
  • tamalero - Monday, June 19, 2017 - link

    The TDP differences are insane! Reply
  • Tuna-Fish - Tuesday, June 20, 2017 - link

    Just a tiny nitpick about the cache hierarchy table:

    TLBs are grouped with cache levels, that is, L1 TLBs are with the L1 caches and the L2 TLB is with the L2 cache, as if the level of TLB were associated with the level of cache. This is not how they work -- any request only has to have its address translated once, when it's loaded from the L1 cache. If there is a miss when accessing the L1 TLB, the L2 TLB is accessed before the L1 cache is.
    Reply
  • PeterCordes - Monday, July 03, 2017 - link

    This common mistake bugs me too! The transistors for the TLB's 2nd level are probably not even near the L2 cache. (And the L2 cache is physically indexed / physically tagged, so it doesn't care about translations or virtual addresses at all). The multi-level TLB is a separate hierarchy from the normal caches.

    I also commented earlier to point out several other errors in [the uarch details](http://www.anandtech.com/comments/11550/the-intel-... e.g. mixing up the register-file sizes with the scheduler size.
    Reply
  • yeeeeman - Tuesday, June 20, 2017 - link

    What this review shows is just how good a deal AMD Ryzen CPUs are. I mean, the R7 1700 is like $300 and it keeps up in many of the tests with the big boys from Intel. Reply
  • Carmen00 - Tuesday, June 20, 2017 - link

    Small typo on the first page, Ian: "For $60 less than the price of the Core i7-7800X...". But the comparison shows $389 vs $299, which is a $90 difference. Otherwise a fantastic, in-depth review, thank you very much! Reply
  • Ian Cutress - Tuesday, June 20, 2017 - link

    Official MSRPs haven't changed. What distributors do with their stock is a different story. Reply
  • Carmen00 - Wednesday, June 21, 2017 - link

    I'm talking about the MSRPs. There's a table ("Comparison: Core i7-7800X vs. Ryzen 7 1700") on Page 1 with the MSRPs as $299 and $389, a $90 difference. The text just above this table says that there's a $60 difference, but 389-299=90, not 60. So either the text is incorrect, or the MSRPs in the table are incorrect. Reply
  • Tephereth - Tuesday, June 20, 2017 - link

    Missing temps and in-game benchmarks... you're the only one on the whole web that has a 7800X to test, so please post those :( Reply
  • Gothmoth - Tuesday, June 20, 2017 - link

    after reading a dozen reviews i say:

    great, now we have the choice between two buggy platforms.... well done.
    i am not going to be a bios betatester for AMD or Intel.

    these two releases are the worst in many years i would say.

    i hope AMD has threadripper ironed out.
    Reply
  • AntDX316 - Tuesday, June 20, 2017 - link

    The new processors are in totally another level/league/class. They dominate in everything and more except a couple benches. Reply
  • AnandTechReader2017 - Tuesday, June 20, 2017 - link

    Of course they are, Ryzen is mainstream, Thread Ripper is the competitor.
    Thread Ripper will be quite interesting, the scaling of the "infinity fabric" will come to the fore and show if AMD's new architecture is a worthy competitor.
    Reply
  • Gothmoth - Tuesday, June 20, 2017 - link

    especially in the power draw and heat class it dominates even my oven.... Reply
  • AntDX316 - Tuesday, June 20, 2017 - link

    The new processors are in totally another level/league/class. They dominate in everything and more except a couple benches. If you try to compare by price you can't. It would make no sense. Reply
  • Gothmoth - Tuesday, June 20, 2017 - link

    it makes sense.. for everyone except stupid fanboys. Reply
  • Hxx - Tuesday, June 20, 2017 - link

    so people are getting pissed because these CPUs perform well, albeit at a much higher price tag. Sounds like a bunch of AMD fanboys.
    It's new tech, the highest performing, and serving a very niche market. Of course it's at a premium price. why wouldn't it be? Luckily, these are not needed by the majority, so I am not sure why people get so worked up about it. If you want Intel then the 7700K is a fantastic $300 CPU. If you want AMD then again the 1700 is also a fantastic CPU. end of story
    Reply
  • Gothmoth - Tuesday, June 20, 2017 - link

    i personally don't care about price... my PCs earn back the money i spend.

    but saying we can't compare price/performance for skylake-x and ryzen is just plain stupid.
    of course we can.

    and we can also compare price/performance when threadripper is released... the real competition to x299.
    Reply
  • Gothmoth - Tuesday, June 20, 2017 - link

    by the way.. what about the RUMORS that coffee lake will be 6 cores... but no hyperthreading?

    that would be EXACTLY what i expect from intel.
    Reply
  • Intredpid3d - Tuesday, June 20, 2017 - link

    Why did you use Intel's compiler for your reviews? With all the other compilers out there that work beautifully with Ryzen, why use the one that is known to be deliberately coded to work very badly on anything other than its creator's products?
    Reply
  • johnp_ - Tuesday, June 20, 2017 - link

    Where did they use Intel's icc? The Chromium compile test was done using VS Community 2015.3. Reply
  • tamalero - Tuesday, June 20, 2017 - link

    That's the interesting thing: in a lot of reviews online there are 3 variations of the same test.

    1) stock i9 7900
    2) optimized compiler i9 7900
    3) OCed i9 7900

    the optimized one yields on average like 10% more performance.
    Reply
  • johnp_ - Tuesday, June 20, 2017 - link

    2) is not possible for a Chromium compile test, as that has a hard dependency on Visual Studio 2015 U3, and 3) first requires overclocking, for which they didn't have enough time yet.

    Regarding higher performance, I expect you mean compiled programs reaching higher performance, not the compilation process requiring less time (which is what AnandTech measures here, and that's the relevant bit for developers).
    Reply
  • Tephereth - Tuesday, June 20, 2017 - link

    "For each of the GPUs in our testing, these games (at each resolution/setting combination) are run four times each, with outliers discarded. Average frame rates, 99th percentiles and 'Time Under x FPS' data is sorted, and the raw data is archived."

    So... where the hell are the games benchmarks in this review?
    Reply
  • beck2050 - Tuesday, June 20, 2017 - link

    The possibility of the 18 core beast in the upcoming Mac Pro is really exciting for music pros.
    That is a tremendous and long overdue leap for power users.
    Reply
  • drajitshnew - Tuesday, June 20, 2017 - link

    "... and only three PCIe 3.0 x4 drives can use the in-built PCIe RAID"
    I would like to know which RAID level you would use. I can't see 3 M.2 drives in RAID 1, and RAID 5 would require access to the CPU for parity calculations. Then RAID 0 it is. Now, which drives will you use for RAID 0 that do not saturate the DMI link for sequential reads? And if your workload does not have predominantly sequential reads, then why are you putting the drives in RAID?
    Reply
  • PeterCordes - Tuesday, June 20, 2017 - link

    Standard motherboard RAID controllers are software raid anyway, where the OS drivers queue up writes to each drive separately, instead of sending the data once over the PCIe bus to a hardware RAID controller which queues writes to two drives.

    What makes it a "raid controller" is that you can boot from it, thanks to BIOS support. Otherwise it's not much different from Linux or Windows pure-software RAID.

    If the drivers choose to implement RAID5, that can give you redundancy on 3 drives with the capacity of 2.

    However, RAID5 on 3 disks is not the most efficient way. A RAID implementation can get the same redundancy by just storing two copies of every block, instead of generating parity. That avoids a ton of RAID5 performance problems, and saves CPU time. Linux md software RAID implements this as RAID10. e.g. RAID10f2 stores 2 copies of every block, striped across as many disks as you have. It works very well with 3 disks. See for example https://serverfault.com/questions/139022/explain-m...

    IDK if Intel's mobo RAID controllers support anything like that or not. I don't use the BIOS to configure my RAID; I just put a boot partition on each disk separately and manage everything from within Linux. IDK if other OSes have soft-raid that supports anything similar either.

    > And if your workload does not have predominantly sequential reads, then why are you putting the drives in raid.

    That's a silly question. RAID0, RAID1, and RAID5 over 3 disks should all have 3x the random read throughput of a single disk, at least for high queue depths, since each disk will only see about 1/3rd of the reads. RAID0 similarly has 3x random write throughput.

    RAID10n2 of 3 disks can have better random write throughput than a single disk, but RAID5 is much worse. RAID1 of course mirrors all the writes to all the disks, so it's a wash for writes. (But can still gain for mixed read and write workloads, since the reads can be distributed among the disks).
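    The "1/3rd of the reads" scaling claim above can be sketched with a toy model (purely illustrative striping arithmetic, not tied to any real RAID implementation or driver):

    ```c
    #include <stdio.h>
    #include <stdlib.h>
    #include <assert.h>
    #include <math.h>

    /* Toy model: random reads striped RAID0-style across 3 disks land on
     * each disk about 1/3 of the time, which is why aggregate random-read
     * throughput scales ~3x at high queue depth. */
    int main(void) {
        enum { DISKS = 3, READS = 30000 };
        int hits[DISKS] = {0};
        srand(1);
        for (int i = 0; i < READS; i++) {
            int block = rand() % 1000000;  /* random logical block address */
            hits[block % DISKS]++;         /* striping maps block -> disk  */
        }
        for (int d = 0; d < DISKS; d++) {
            double share = (double)hits[d] / READS;
            printf("disk %d: %.3f of reads\n", d, share);
            assert(fabs(share - 1.0 / 3) < 0.02); /* each disk sees ~1/3 */
        }
        return 0;
    }
    ```

    The same argument is why RAID1 reads (but not writes) also scale: any copy can service a given read, so the driver can spread them out.
    
    
    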
    Reply
  • Lieutenant Tofu - Tuesday, June 20, 2017 - link

    I wonder why the 1600X outperforms the 1800X here on WebXPRT. It's not a huge difference, but I don't see why it's happening. 6-core vs. 8-core, 3.6 GHz base, 4.0 GHz turbo. This presumably runs in just one thread, so performance should be nearly identical. The only reason I can think of is less contention across the IF on the 1600X due to fewer enabled cores, but I don't see that having a major effect on a single-threaded test like this one.

    Maybe 1600X can XFR to a little higher than the 1800X.
    Reply
  • Eyered - Tuesday, June 20, 2017 - link

    Did they have any issues with heat at all? Reply
  • mat9v - Tuesday, June 20, 2017 - link

    If that were so everyone would be using HEDT instead of 4c/8t CPUs Reply
  • mat9v - Tuesday, June 20, 2017 - link

    Then why doesn't every workstation consist of dual-CPU Xeons? If the expense is so insignificant compared to how much money a faster machine will earn... Reply
  • mat9v - Tuesday, June 20, 2017 - link

    I'm just wondering how the 7900X managed to stay within the 140W bracket during Prime95 tests when in other reviews it easily reached 250W or more. Is it some internal throttling mechanism that keeps the CPU constantly dynamically underclocked to stay within the power envelope? How does that compare to a forced 4GHz CPU clock? Reply
  • mat9v - Tuesday, June 20, 2017 - link

    And yet in conclusion you say to play it safe and get 7900X ?
    How does that work together?
    Reply
  • mat9v - Tuesday, June 20, 2017 - link

    To play it safe, invest in the Core i9-7900X today.
    To play it safe and get a big GPU, save $400 and invest in the Core i7-7820X today.

    Then the conclusion should have been: wait for a fixed platform. I'm not even suggesting choosing Ryzen, as it performs slower, but why encourage buying a flawed (for now) platform?
    Reply
  • mat9v - Tuesday, June 20, 2017 - link

    Please then correct the tables on the 1st page comparing Ryzen to the 7820X and 7800X to state that Intel has 24 lanes, as they leave 24 for PCIe slots and 4 are reserved for DMI 3.0.
    If you strip Ryzen's lanes to only show those available for PCIe, do so for Intel too.
    Reply
  • Ryan Smith - Wednesday, June 21, 2017 - link

    The tables are correct. The i7 7800 series have 28 PCIe lanes from the CPU for general use, and another 4 DMI lanes for the chipset. Reply
  • PeterCordes - Tuesday, June 20, 2017 - link

    Nice article, thanks for the details on the microarchitectural changes, especially to execution units and cache. This explains memory bandwidth vs. working-set size results I observed a couple months ago on Google Compute Engine's Skylake-Xeon VMs with ~55MB of L3: The L2-L3 transition was well beyond 256kB. I had assumed Intel wouldn't use a different L3 cache design for SKX vs. SKL, but large L2 doesn't make much sense with an inclusive L3 of 2 or 2.5MB per core.

    Anyway, some corrections for page 3: The allocation queue (IDQ) in Skylake-S is always 64 uops, with or without HT. For example, I looked at the `lsd.uops` performance counter in a loop with 97 uops on my i7-6700k. For 97 billion counts of uops_issued.any, I got exactly 0 counts of lsd.uops, with the system otherwise idle. (And I looked at cpu_clk_unhalted.one_thread_active to make sure it was really operating in non-HT mode the majority of the time it was executing.) Also, IIRC, Intel's optimization manual explicitly states that the IDQ is always 64 entries in Skylake.

    The scheduler (aka RS or Reservation Station) is 97 unfused-domain uops in Skylake, up from 60 in Haswell. The 180int / 168fp numbers you give are the int / fp register-file sizes. They are sized more like the ROB (224 fused-domain uops, up from 192 in Haswell), not the scheduler, since like the ROB, they have to hold onto values until retirement, not just until execution. See also http://blog.stuffedcow.net/2013/05/measuring-rob-c... for when the PRF size vs. the ROB is the limit on the out-of-order window. See also http://www.realworldtech.com/haswell-cpu/6/ for a nice block diagram of the whole pipeline.

    SKL-S DIVPS *latency* is 11 cycles, not 3. The *throughput* is one per 3 cycles for 128-bit vectors, or one per 5 cycles for 256b vectors, according to Agner Fog's table. I forget if I've tested that myself. So are you saying that SKL-SP has one per 5 cycle throughput for 128-bit vectors? What's the throughput for 256b and 512b vectors?

    -----

    It's really confusing the way you keep saying "AVX unit" or "AVX-512 unit" when I think you mean "512b FMA unit". It sounds like vector-integer, shuffle, and pretty much everything other than FMA will have true 512b execution units. If that's correct, then video codecs like x264/x265 should run the same on LCC vs. HCC silicon (other than differences in mesh interconnect latency), because they're integer-only, not using any vector-FP multiply/add/FMA.

    -------

    > This should allow programmers to separate control flow from data flow...

    SIMD conditional operations without AVX512 are already done branchlessly (I think that's what you mean by separate from control-flow) by masking the input and/or output. e.g. to conditionally add some elements of a vector, AND the input with a vector of all-one or all-zero elements (as produced by CMPPS or PGMPEQD, for example). Adding all-zeros is a no-op (the additive identity).

    Mask registers and support for doing it as part of another operation makes it much more efficient, potentially making it a win to vectorize things that otherwise wouldn't be. But it's not a new capability; you can do the same thing with boolean vectors and SSE/AVX VPBLENDVPS.
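    A minimal SSE sketch of that masking idiom (the function name and values are made up for illustration):

    ```c
    #include <immintrin.h>
    #include <stdio.h>

    /* Branchless conditional add: out[i] = a[i] + (a[i] > t ? b[i] : 0).
     * CMPPS produces an all-ones/all-zeros mask per element; ANDing it
     * with b yields either b[i] or +0.0, the additive identity, so no
     * branch is needed. */
    static void cond_add(const float *a, const float *b, float t, float *out) {
        __m128 va   = _mm_loadu_ps(a);
        __m128 vb   = _mm_loadu_ps(b);
        __m128 mask = _mm_cmpgt_ps(va, _mm_set1_ps(t)); /* a[i] > t ? ~0 : 0 */
        __m128 add  = _mm_and_ps(mask, vb);             /* b[i] or +0.0f     */
        _mm_storeu_ps(out, _mm_add_ps(va, add));
    }

    int main(void) {
        float a[4] = {1.0f, 5.0f, 2.0f, 8.0f};
        float b[4] = {10.0f, 10.0f, 10.0f, 10.0f};
        float out[4];
        cond_add(a, b, 4.0f, out);  /* only elements > 4 get +10 */
        printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]); /* 1 15 2 18 */
        return 0;
    }
    ```

    With AVX-512, the compare result would instead land in a k mask register and the whole thing folds into one masked VADDPS.
    
    
    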
    Reply
  • PeterCordes - Tuesday, June 20, 2017 - link

    Speed Shift / Hardware P-State is not Windows-specific, but this article kind of reads as if it is.

    Your article doesn't mention any other OSes, so nothing it says is actually wrong: I'm sure it did require Intel's collaboration with MS to get support into Win10. The bullet-point in the image that says "Collaboration between Intel and Microsoft specifically for W10 + Skylake" may be going too far, though. That definitely implies that it only works on Win10, which is incorrect.

    Linux has supported it for a while. "HWP enabled" in your kernel log means the kernel has handed off P-state selection to the hardware. (Since Linux is open-source, Intel contributed most of the code for this through the regular channels, like they do for lots of other drivers.)

    dmesg | grep intel_pstate
    [ 1.040265] intel_pstate: Intel P-state driver initializing
    [ 1.040924] intel_pstate: HWP enabled

    The hardware exposes a knob that controls the tradeoff between power and performance, called Energy Performance Preference or EPP. Len Brown@Intel's Linux patch notes give a pretty good description of it (and how it's different from a similar knob for controlling turbo usage in previous uarches), as well as describing how to use it from Linux. https://patchwork.kernel.org/patch/9723427/.

    # CPU features related to HWP, on an i7-6700k running Linux 4.11 on bare metal
    fgrep -m1 flags /proc/cpuinfo | grep -o 'hwp[_a-z]*'
    hwp
    hwp_notify
    hwp_act_window
    hwp_epp

    I find the simplest way to see what speed your cores are running is to just `grep MHz /proc/cpuinfo`. (It does accurately reflect the current situation; Linux finds out what the hardware is actually doing).

    IDK about OS X support, but I assume Apple has got it sorted out by now, almost 2 years after SKL launch.
    Reply
  • Arbie - Wednesday, June 21, 2017 - link

    There are folks for whom every last compute cycle really matters to their job. They have to buy the technical best. If that's Intel, so be it.

    For those dealing more with 'want' than 'need', a lot of this debate misses an important fact. The only reason Intel is suddenly vomiting cores, defecating feature sizes, and pre-announcing more lakes than Wisconsin is... AMD. Despite its chronic financial weakness that company has, incredibly, come from waaaay behind and given us real competition again. In this ultra-high stakes investment game, can they do that twice? Maybe not. And Intel has shown us what to expect if they have no competitor. In this limited-supplier market it's not just about who has the hottest product - it's also about whom we should reward with our money, and about keeping vital players in the game.

    I suggest - if you can, buy AMD. They have earned our support and it's in our best interests to do so. I've always gone with Intel but have lately come to see this bigger picture. It motivated me to buy an 1800X and I will also buy Vega.
    Reply
  • Rabnor - Wednesday, June 21, 2017 - link

    To play it safe and get a big GPU, save $400 and invest in the Core i7-7820X today.
    You have to spend that $400+ on a good motherboard & AIO cooler.
    Are you sold by Intel, AnandTech?
    Reply
  • Synviks - Thursday, June 22, 2017 - link

    For some extra comparison: running Cinebench R15 on my 14c 2.7ghz Haswell Xeon, with turbo to 3ghz on all cores, my score is 2010.

    Pretty impressive performance gain if they can shave off 4 cores and end up with higher performance.
    Reply
  • Pri - Thursday, June 22, 2017 - link

    On the first page you wrote this:
    Similarly, the 6-core Core i7-7820X at $599 goes up against the 8-core $499 Ryzen 7 1800X.

    The Core i7 7820X was mistakenly written as a 6-core processor when it is in-fact an 8-core processor.

    Kind Regards.
    Reply
  • Gigabytes - Thursday, June 22, 2017 - link

    Okay, here is what I learned from this article. Gaming performance sucks and you will be able to cook a pizza inside your case. Did I miss anything?

    Oh, one thing missing.

    Play it SMART and wait to see the Ripper in action before buying your new Intel toaster oven.
    Reply
  • Soheil - Sunday, June 25, 2017 - link

    Does anyone know why the 1600X is better than the 1800X? Reply
  • ehfield7 - Thursday, June 29, 2017 - link

    I'm late here as usual, but why are you not comparing against the 7700K and 7600K? I get that these are HEDT chips, but it's worth comparing against the high-end mainstream, especially when the 7800X and 7700K are priced similarly enough that someone MIGHT consider jumping over.

    I hate to say it, but this is the typical stuff you guys used to do, and I know it takes more time to put together more CPUs, but logical comparisons MUST be made and these charts show a bit of laziness.
    Reply
  • ashlol - Friday, June 30, 2017 - link

    can we have the GPU tests please Reply
  • Oxford Guy - Saturday, July 01, 2017 - link

    "The discussion on whether Intel should be offering a standard goopy TIM or the indium-tin solder that they used to (and AMD uses) is one I’ve run on AnandTech before, but there’s a really good guide from Roman Hartung, who overclocks by the name der8auer. I’m trying to get him to agree to post it on AnandTech with SKL-X updates so we can discuss it here, but it really is some nice research. You can find the guide over at http://overclocking.guide."

    If you have a point to make then make it. After all, you said you've already "run" this discussion before. Tell us why polymer TIM is a better choice than solder (preferably without citing cracks from liquid nitrogen cooling).
    Reply
  • ashlol - Monday, July 03, 2017 - link

    Anyway both are bad since you have to delid it to achieve good cooling. I have delidded a 4770k and a 6700k and put liquid metal TIM between the die and the IHS and they both run 15°C cooler at 4.6-4.7GHz@60°C with custom loop. And from seeing the temperature under overclock I will have to delid those skylake-x too. Reply
  • parlinone - Tuesday, July 04, 2017 - link

    What I find most shocking is that a $329 Ryzen 1700 outperforms a $389 7800X at Cinebench... for less than half the power.

    The performance-to-power ratio translates to a 239% advantage for AMD. That's unprecedented; I never imagined I'd see the day.
    Reply
  • dwade123 - Thursday, July 06, 2017 - link

    Only in Cinebench and AES does Ryzen look good. The 7800X beats the 1800X in everything else in this review. Ryzen is too inconsistent in both productivity and gaming. It is priced according to that, and not out of good faith from AMD. This is also the reason why Coffee Lake will only top out at 6 cores: because it can consistently beat the best Ryzen model. Reply
  • IGTrading - Friday, July 14, 2017 - link

    I absolutely disagree with the conclusion. The correct conclusion can only be drawn when comparing apples to apples. And if you want to be objective and compare apples to oranges, you can't just take into consideration today's benchmark results and price.

    Have we forgotten about the days we REVIEWERS were complaining about the high power consumption of the Pentium 4 and Pentium D?! What about the FX 8350?! Is power consumption not an objective metric anymore?! What about platform price?! What about price/performance?! Why do some people get suddenly blinded by marketing money?!

    Conclusion: the i9-7900X is the highest performance the home power user can get today, if money for the CPU, mobo and subsequent power consumption is not an issue. Comparing apples to apples, or core for core, the i7-7820X clearly shows Intel's anxiety over Zen. The i7-7820X consumes 40% more than the AMD 1800X and costs 20% more, while its motherboard is 200% the price. So paying all these heaps of money, CORE for CORE, the Intel 7820X is a bit faster in some benchmarks, as it should be considering the power consumption and price you pay, EQUAL in a few benchmarks and SLOWER in a few others. Would you pay the serious extra money for this?! And put up with the 40% higher power consumption and heat generation?! Come ooooon ... Reply
  • azulon1 - Sunday, July 16, 2017 - link

    Wow, how exactly is this fair, that Intel gets a pass for gaming because there were problems with the platform? If I remember, Ryzen also had problems with gaming, but it didn't stop you guys then, did it? Don't group me in as an AMD fanboy, but why such a bias? Reply
  • Soheil - Saturday, July 29, 2017 - link

    no one answered me? why is the 1600X better than the 1700 and 1700X, and sometimes better than the 1800X?
    what about the 1600? is it as good as the 1600X for gaming or not?
    Reply
  • dstephens80 - Monday, August 14, 2017 - link

    All, I have come across something interesting and am wondering if it is only me. I just received my 7820X and was playing around with overclocking, and I have to question Intel's claim that the CPUs are "Fully Unlocked".

    Using an Asus Strix-E X299 MB, I adjusted my overclock to 4.6GHz, then booted successfully and started my stress testing. I noticed my clock speed was bouncing between 4.3 and 4.6, so I thought maybe SpeedStep was interfering, and went into BIOS and turned off SpeedStep, TurboBoost and C-states. When I booted back up I received an error for the TurboBoost utility (expected), but my speed was at the stock 3.6GHz, and the Intel Extreme Tuning Utility showed the same but also showed my multiplier should be set at 46. I went back into BIOS and enabled "TurboBoost", and upon reboot CPU-Z and the Intel utility both showed the speed at 4.6GHz.

    My issue with the "Fully Unlocked" claim is that an OC should not be dependent on a software driver. I have confirmed this by the fact that when I boot Linux the OC is not applied. Reply

Log in

Don't have an account? Sign up now