Final Words

Does This Make Intel and AMD the Best of Frenemies?

It does seem odd, Intel and AMD working together in a supplier-buyer relationship. Previous interactions between the two have been adversarial at best, and have required calls to the lawyers at worst. So who benefits from this relationship?

AMD: A GPU That Someone Else Sells? Sure, How Many Do You Need?

Some users might point to AMD’s financials as a reason for this arrangement: had Zen not taken off, this would have been a separate source of income for AMD. Ultimately AMD is looking healthier since Ryzen, and even if Intel did rock up with piles of money, it is unclear from the scope of the product how much volume Intel would be requesting.

Or some might state that this sort of product, if positioned correctly, would encroach on some of AMD’s own markets, such as laptop APUs or laptop GPUs. My response to this is that it actually ends up a win for AMD: Intel is currently aiming at 65W/100W mobile devices, which is some way away from the Ryzen Mobile parts that should come into force during 2018. Every chip AMD sells to Intel is still a sale for AMD, and it means there is discrete-class graphics in a system that might otherwise have had an NVIDIA product in it. NVIDIA’s laptop GPU program is extensive: with Intel at the helm driving the finished product rather than AMD, there is scope for AMD-based graphics to appear in many more devices than if AMD went it alone. People trust Intel on this, and have done for years: even if it is marketed as an Intel product, it’s a win for AMD.

Intel: What Does Intel Get Out Of This?

Intel’s internal graphics, known as ‘Gen’ graphics externally, have been third best behind NVIDIA and AMD for grunt. They have had trouble competing against ARM’s Mali in the low power space, and the scaling of the design has not seemed to lend itself to large, 250W GPUs for desktops. If you have been following certain analysts that keep tabs on Intel’s graphics, you might have read about the woes and missed targets that have potentially happened behind closed doors every time there has been a node shrink. Even though Intel has competed with GT3/4 graphics with eDRAM in the past (known as the Crystalwell products), some of which performed well, they came at additional expense for the OEMs that used them.

So rather than scale Gen graphics out to something bigger, Intel worked with AMD to purchase Radeon Vega. It is unclear if Intel approached NVIDIA to try something similar, as NVIDIA is the bigger company, but AMD has a history of being the vendor required by one of Intel’s big OEM partners: Apple. AMD has also had a long-term semi-custom silicon strategy in place, while NVIDIA does not advertise that part of its business as such.

What Intel gets is essentially a better version of its old Crystalwell products, albeit at a higher power consumption. The end product, Intel with Radeon RX Vega M Graphics, aims to match other solutions (namely Intel + NVIDIA MX150/GTX 1050 combinations) but with reduced board space, allowing for thinner/lighter designs or designs with more battery. A cynic might suggest that either way, it was always going to be an Intel sale, so why bother going to the effort? One of the tracks of Intel’s notebook strategy in recent years has been trying to convince users to upgrade more frequently: over the last couple of years, users who bought 2-in-1s were found to refresh their units quicker than users with clamshell devices. Intel is trying to do the same thing here with a slightly higher class of product. Whether the investment to create such a product is worth it will bear out in sales numbers.

It's Not Completely Straightforward

One thing is clear though: Intel’s spokespersons who gave us our briefing were trained very specifically to avoid mentioning AMD by name in relation to this product line. Every time I had expected them to say ‘AMD Graphics’ in our pre-briefing, they all said ‘Radeon’. As far as the official line goes, the graphics chip was purchased from ‘Radeon’, not from AMD. I can certainly understand trying to stay on brand message, and avoiding the name from an x86 competitive standpoint, but this product fires a shot across the bow of NVIDIA, not AMD. Call a spade a spade.

Aside from the three devices that will be coming with the new processors, from HP, Dell, and the Intel NUC, one interesting side story came out of this. Intel has already had interest from a cloud gaming company for these new processors. In the same way that a massive GPU-based datacenter can offer many users cloud gaming services, these new chips are set to be in the datacenter for 1080p gaming at super high density, perhaps more so than current GPU solutions. An interesting thought.


Intel NUC Enthusiast 8: The Hades Canyon Platform

The HP and Dell units are set to be announced later this week during CES. For information about the Intel NUC, using the overclockable Core i7-8809G processor, Ganesh has the details in a separate news post.

8th Gen Gets More Complex

67 Comments


  • mczak - Monday, January 8, 2018 - link

    FWIW Apple has shipped plenty of MBPs where the charger isn't quite sufficient. These will drain the battery a little even if plugged in when running at full tilt (and at least some of them also have the habit of running really really slow if the battery isn't just old but completely dead, because they will be forced to low power states).
    Albeit I agree that for an 89W charger a 100W CPU+GPU is probably too much, since together with the rest of the system that might amount to a sustained power draw of over 110W, which would drain the battery too fast. But if Apple wants an 80W version of it, I'm pretty sure Intel would just deliver that; those limits can be easily changed.
  • Kevin G - Sunday, January 7, 2018 - link

    And MS SQL Server is available for Linux. I think hell has frozen over.
  • tracker1 - Monday, January 8, 2018 - link

    For what it's worth, MS SQL Server on Linux/Docker is fairly limited, and the management software is still Windows-based, though you can do anything you need via SQL execute statements... it's not the friendliest. I usually treat my database as mostly dumb anyway.
  • Zingam - Sunday, January 7, 2018 - link

    But does it melt down?
  • haukionkannel - Monday, January 8, 2018 - link

    Yes it does.
  • B166ER - Sunday, January 7, 2018 - link

    I just don't get this marriage. It seems the graphics power is juuust a wee bit over Intel's own, so what graphics need would this push? "Oh look I get 5 more fps in Minecraft!"??
  • schizoide - Sunday, January 7, 2018 - link

    I thought it would be faster than that, so I did some back-of-the-napkin math. 24 CUs = 43% of a Vega 56, but 19% slower on the GPU clock and 50% slower on the HBM. Seems reasonable to guess it will offer about 25% of the performance of a Vega 56.

    Vega 56 gets 20k in 3DMark, and 25% of that is 5k. The fastest iGPU I could find on Futuremark's site is the 6700HQ, which scored 7910. So... it's slower than the fastest Intel GPU from 2 years ago. Is that right?
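[Ed: the napkin math above can be sketched out explicitly. Every figure below is the comment's own assumption (CU counts, clock deltas, a round 20k 3DMark score for Vega 56), not a measured number:]

```python
# Rough performance scaling from the comment's assumed figures.
vega56_cus = 56
vega_m_cus = 24           # top Vega M part, per the comment
gpu_clock_factor = 0.81   # "19% slower on the GPU clock"

cu_fraction = vega_m_cus / vega56_cus              # ~0.43 of Vega 56's CUs
compute_estimate = cu_fraction * gpu_clock_factor  # ~0.35 of Vega 56's compute

# The comment rounds down to ~25% to account for the slower HBM.
vega56_3dmark = 20_000                 # assumed round figure
estimated_score = vega56_3dmark * 0.25

print(round(cu_fraction, 2), round(compute_estimate, 2), int(estimated_score))
# prints: 0.43 0.35 5000
```

So the raw CU-and-clock scaling suggests ~35% of a Vega 56, and the commenter's ~25% guess is that number haircut for the halved memory bandwidth.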
  • schizoide - Sunday, January 7, 2018 - link

    Yeah, that wasn't right: Futuremark switched it from GPU to CPU when I searched for Intel. The fastest GPU score I could find for an Intel iGPU was the Iris Pro 6200 from the Broadwell generation. It got 1630. Skylake improved the iGPU quite a lot, but I can't find the benchmarks offhand.
  • JohnPec - Monday, January 8, 2018 - link

    Linus said it will be as good as a 1060 Max-Q or better.
  • tipoo - Sunday, January 7, 2018 - link

    Hm? It's definitely a fair shot over the Iris Plus 650, and the Pro line seems dead after the Pro 580. This will absolutely be 3-4x over an Iris Plus 650, let alone the eDRAM-less Iris HD 630 thrown in there.

    What did you mean by barely above the Intel part? I see nothing close.
