In a brief tweet posted yesterday, Intel confirmed the launch date for the first discrete GPU developed under the company’s new dGPU initiative. The otherwise unnamed high-end GPU will launch in 2020, just two to two-and-a-half years from now.

The tweet came amidst reports that Intel had given the same details to a small group of analysts last week, and appears to have been released to confirm those reports. The nature of the meeting itself hasn’t been disclosed, but Intel regularly gives analysts broad timelines for new technologies as part of outlining its plans to remain on top of the market.

This new GPU would be the first product of Intel’s revitalized GPU efforts, which kicked into high gear at the end of 2017 with the hiring of former AMD and Apple GPU boss Raja Koduri. Intel, of course, has watched sometimes-ally and sometimes-rival NVIDIA grow at a nearly absurd pace thanks to the machine learning boom. Intel’s third shot at dGPUs is ultimately an effort to establish itself in an accelerator market that is no longer niche, and that is increasingly splitting off customers who previously would have relied entirely on Intel CPUs.

Interestingly, a 2020 launch date for the new discrete GPU falls within the window we had expected for the project. But the long development cycle for a high-end GPU means this project was undoubtedly underway before Raja Koduri joined Intel in late 2017 – most likely it would have needed to kick off at the start of that year, if not in 2016 – so Koduri has evidently inherited an existing Intel project rather than starting from scratch. Whether this is an evolution of Intel’s Gen GPU architecture or an entirely new design remains to be seen, as there are good arguments for both possibilities.

Intel isn’t saying anything else about the GPU at this time, though we do know from Intel’s statements when it hired Koduri that the company is starting with high-end GPUs – a fitting choice given the accelerator market Intel is going after. This GPU is almost certainly aimed at compute users first and foremost, especially if Intel adopts the compute-first product strategy that AMD and NVIDIA have started to favor. But Intel’s dGPU efforts are not entirely focused on professionals: the company has also confirmed that it wants to go after the gaming market, though what that would entail – and when – is another question entirely.

Source: Intel




  • HStewart - Wednesday, June 13, 2018 - link

    Intel could support both FreeSync and G-Sync and let AMD and NVidia fight it out - in any case Intel will work with the winner.
  • Diji1 - Thursday, June 14, 2018 - link

    I don't think that will happen.

    G-Sync is only used on high-end displays that are developed with Nvidia's input, presumably partly because they don't want the Nvidia branding on bad displays. So there's little reason for them to want another entity's GPUs selling as G-Sync-capable cards, given that each one equals a lost sale of their own GPUs.
  • peevee - Thursday, June 14, 2018 - link

    "Intel could support both FreeSync and GSync "

    Not without paying NVidia for G-Sync licenses, which would be super idiotic (and even incurring the extra development/QA/support cost of G-Sync would be stupid too).
  • The Hardcard - Wednesday, June 13, 2018 - link

    Existing project, or maybe relying on existing technologies. They have already laid out their iGPU on the latest process. If they were satisfied with the new technologies added to that, it wouldn't take long to maybe double or quadruple it, then tune the transistors to handle more power.

    Alternatively, Koduri could have started driving this before he hired on, maybe offering guidance and perspective while Kaby-G was being developed.
  • HStewart - Wednesday, June 13, 2018 - link

    Think of it this way: when Koduri was working on Kaby-G and working with EMIB, he noticed the real potential of this technology and wanted to be involved with Intel - of course I would not doubt that they talked about the potential while working on the project.

    I believe EMIB has tons of potential in the computer industry - even beyond the GPUs it has already been used for - I would not doubt we will see EMIB on desktop computers in the future.

    Kaby G was just a test bed for technology.
  • coder543 - Wednesday, June 13, 2018 - link

    EMIB is essentially Intel's take on the Multichip Module (MCM), which AMD is already employing in all of their current processors. We already see it on desktop computers today, so why wait until the future?
  • coder543 - Wednesday, June 13, 2018 - link

    Also worth mentioning that AMD's Navi GPU architecture is rumored to be another one of AMD's highly scalable MCM designs, letting them build enormous GPUs with very high yields. If Koduri were excited about EMIB/MCM, then it wouldn't really matter where he works. I'm sure *that* was not his motivation for working for Intel.
  • HStewart - Wednesday, June 13, 2018 - link

    I am sure EMIB is part of it - because of the Kaby G experience - but EMIB was also developed before Koduri was involved with the Kaby G project - so Intel didn't copy AMD - they made it better and allowed other companies (AMD) to work together with them.
  • HStewart - Wednesday, June 13, 2018 - link

    I don't believe AMD's MCM can handle dies from different processes (nm / makers), whereas EMIB can handle it.

    AMD wants to believe they are the future - but Koduri knew better and jumped ship.
  • peevee - Wednesday, June 13, 2018 - link

    Just in time for their 10nm to get decent yields.
    Everybody else will be on 5nm.
