In a short tweet posted to their Twitter feed yesterday, Intel confirmed the launch date for their first discrete GPU developed under the company’s new dGPU initiative. The otherwise unnamed high-end GPU will be launching in 2020, two to two-and-a-half years from now.

The tweet was posted amidst reports that Intel had given the same details to a small group of analysts last week, with the tweet being released to confirm those reports. The nature of the meeting itself hasn’t been disclosed, but Intel regularly gives analysts extremely broad timelines for new technologies as part of outlining their plans to remain on top of the market.

This new GPU would be the first to come out of Intel’s revitalized GPU efforts, which kicked into high gear at the end of 2017 with the hiring of former AMD and Apple GPU boss Raja Koduri. Intel, of course, is in the midst of watching sometimes-ally and sometimes-rival NVIDIA grow at a nearly absurd pace thanks to the machine learning boom. So Intel’s third shot at dGPUs is ultimately an effort to establish themselves in an accelerator market that is no longer niche, but is increasingly splitting off customers who previously would have relied entirely on Intel CPUs.

Interestingly, a 2020 launch date for the new discrete GPU is inside the estimate window we had seen for the project. But the long development cycle for a high-end GPU means that this project was undoubtedly started before Raja Koduri joined Intel in late 2017 – most likely it would have needed to kick off at the start of 2017, if not in 2016 – so this implies that Koduri has indeed inherited an existing Intel project rather than starting from scratch. Whether this is an evolution of Intel’s Gen GPU architecture or an entirely new design remains to be seen, as there are good arguments for both possibilities.

Intel isn’t saying anything else about the GPU at this time, though we do know from Intel’s statements when they hired Koduri that they’re starting with high-end GPUs – a fitting choice given the accelerator market Intel is going after. This GPU is almost certainly aimed at compute users first and foremost, especially if Intel adopts the bleeding-edge-first strategy that AMD and NVIDIA have started to favor. But Intel’s dGPU efforts are not entirely focused on professionals: Intel has also confirmed that they want to go after the gaming market as well, though what that would entail – and when – is another question entirely.

Source: Intel

56 Comments

  • Marlin1975 - Wednesday, June 13, 2018 - link

    I'll believe it when I see it. That, and Intel's history of really awful drivers will probably also come into play.
    So it will look good on paper, be delayed if it ever comes at all, and then have awful drivers that never get its full potential out of it.
    Reply
  • PeachNCream - Wednesday, June 13, 2018 - link

    I haven't had a bad experience with Intel graphics drivers since the GMA 950. Those were truly awful, but from the 4500MHD and up, things have improved quite a bit. Most of the PC games I play run well on an Ivy Bridge HD 4000 so I rarely bother to send them to my laptop's Quadro NVS 5200m. There is a notable performance difference in some cases, but overall, Intel's drivers have been very good to me over the last decade. Reply
  • Kvaern1 - Wednesday, June 13, 2018 - link

    Hmm, a few years ago I put together a Haswell-based PC for a young family member and I had to get a dGPU for it because Minecraft graphics were completely messed up on the Intel GPU. Reply
  • PeachNCream - Thursday, June 14, 2018 - link

    I don't have any experience with Minecraft so I can't speak for how that'd work on any of Intel's GPUs. But I have used every Intel graphics chip released from i740 up to the HD 4000 with the exception of the HD 2000 and HD 2500 for gaming. There used to be stupid things you'd have to do to make stuff run properly, but Intel has come a long way since the dark days of the 915 and 950. Performance compared to dGPUs has always lagged with chipset or CPU-based graphics by a significant margin so setting reasonable expectations is an absolute necessity. I don't doubt there are still issues in other games beyond Minecraft that'd push the need for a different GPU as well, but the point remains that Intel's come a long, long way in driver quality over the last decade or so. I would give AMD a nod for their current iGPU being a better solution for gamers on a shoestring budget since the performance is better, but when you're not allocating a decent sum of money to your computer, you get what you get and sometimes that's an Intel GPU. Reply
  • BenSkywalker - Wednesday, June 13, 2018 - link

    Two to two and a half years out, 2020..... Can't make the math work. Reply
  • Ryan Smith - Wednesday, June 13, 2018 - link

    Current date: June, 2018
    Current date + 2 years: June, 2020
    Current date + 2.5 years: December, 2020

    (While it could conceivably arrive before June, let's be honest: the dev time required means any launch is going to be H2'2020)
    Reply
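
    The arithmetic above can be sanity-checked in a few lines of Python (the June 2018 anchor is an assumption based on the comment timestamps):

    ```python
    from datetime import date

    announce = date(2018, 6, 13)  # date of Intel's tweet, per the comment timestamps

    # "Two to two and a half years out" from June 2018:
    plus_two_years = date(announce.year + 2, announce.month, announce.day)
    plus_two_and_a_half = date(announce.year + 2, announce.month + 6, announce.day)

    print(plus_two_years)       # 2020-06-13
    print(plus_two_and_a_half)  # 2020-12-13
    ```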
  • jrs77 - Wednesday, June 13, 2018 - link

    I don't know if the current integrated graphics is scalable and/or to what degree. But imagine an Iris Pro with double the EUs and a couple GB of GDDR5. If it scales well, then you have a GPU at the level of a GTX 1050 already, without inventing anything new. Reply
  • jeremyshaw - Wednesday, June 13, 2018 - link

    I don't believe Intel has a GDDR5 controller, but they do have some experience with HBM2 and its implementations. While the HBM2 controller wasn't Intel's, the implementation was theirs. HMC was already pioneered by their partner, Micron, before Hynix and AMD made the similar HBM. Reply
  • Eris_Floralia - Wednesday, June 13, 2018 - link

    Remember Larrabee? That thing had a GDDR5 IMC. Reply
  • edzieba - Wednesday, June 13, 2018 - link

    Yep. Knights Corner (the last Knights die to retain on-die texture units) had GDDR5 controllers. Knights Landing switched to on-package HMC plus off-package DDR4, and this appears to be carried over to Knights Mill. I don't think Knights Hill has had its memory layout published publicly. Reply
