Dennis Fong’s Raptr is a utility that has made a name for itself in the PC GPU space in a relatively short period of time. After pivoting away from its origins as a gaming-focused chat client, the company struck a deal with AMD in 2013 to become a quasi-second-party GPU utility developer. As part of that partnership, Raptr provided AMD with a branded version of its client (the AMD Gaming Evolved App) along with its game settings recommendation service, and more recently Raptr has added other GPU-centric features such as GameDVR hardware-accelerated game recording. The partnership has been fruitful for both companies: AMD and its users gain access to new software features, while Raptr in turn gets promoted by AMD and included in AMD’s driver downloads.

Between Raptr/GEA and NVIDIA’s GeForce Experience, both of the major dGPU vendors now offer game settings recommendations and game recording through their respective applications. In fact the only PC GPUs without a similar utility are Intel’s iGPUs, and this week at GDC Intel and Raptr revealed that this is changing.

Being announced this week is a new partnership between Intel and Raptr that will see Raptr’s GPU features extended to Intel’s GPUs. This means that Raptr will gain the ability to use Intel’s QuickSync encoder for its GameDVR feature, its driver update checking & notifications will cover Intel’s drivers, and its settings recommendation service will soon be able to profile performance on, and generate settings for, Intel’s GPUs as well. This deal essentially brings Intel up to parity with AMD on the utility front, offering the same features as AMD through what is fundamentally the same software.

There are, however, a few differences from the AMD partnership, especially at the hardware level. While AMD offers a complete range of GPUs, Intel focuses solely on the low end with its integrated GPUs. For Intel this makes a good game settings recommendation service especially important: the lower performance of its GPUs means games will often require extra tweaking to be made playable, and there is no significant performance margin to absorb suboptimal settings. It’s also worth keeping in mind just how large the Intel HD Graphics user base is, which means any improvements Intel invests in here will pay off in bulk, making the reward-to-effort ratio quite high.

More broadly speaking, Intel also benefits because the recommendation service is continually updated, something static profiles from Intel or game developers cannot match. Intel already does performance profiling for new games, and Raptr will be getting this data as well in order to create its initial recommendations. But once put in motion, Raptr’s crowd-sourced data collection mechanism means that it can adapt to things like driver performance improvements and use that data to provide newer, better recommendations. And with Raptr offering driver update notifications (something Intel’s control panel does not), users will be more likely to have up-to-date drivers that offer the best performance for a given game.
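Raptr has not published how its recommendation engine actually works, but the crowd-sourced mechanism described above can be sketched in broad strokes: aggregate frame-rate reports per (GPU, game, preset) combination and recommend the highest-quality preset whose typical frame rate clears a playability threshold. The preset names, sample data, and 30 FPS target below are all illustrative assumptions, not Raptr's real data model.

```python
from collections import defaultdict
from statistics import median

# Quality presets from lowest to highest; purely illustrative.
PRESETS = ["low", "medium", "high", "ultra"]

def recommend(samples, gpu, game, target_fps=30.0):
    """Pick the highest preset whose median reported FPS meets the target.

    samples: iterable of (gpu, game, preset, fps) tuples reported from
    users' machines (the crowd-sourced part).
    """
    by_preset = defaultdict(list)
    for s_gpu, s_game, preset, fps in samples:
        if s_gpu == gpu and s_game == game:
            by_preset[preset].append(fps)

    # Walk presets from highest quality down. Fresh reports (e.g. after a
    # driver performance improvement) shift the medians, so the
    # recommendation adapts over time without manual re-profiling.
    for preset in reversed(PRESETS):
        fps_list = by_preset.get(preset)
        if fps_list and median(fps_list) >= target_fps:
            return preset
    return "low"  # fall back to the safest preset

samples = [
    ("HD 4600", "Dota 2", "medium", 42.0),
    ("HD 4600", "Dota 2", "medium", 38.0),
    ("HD 4600", "Dota 2", "high", 24.0),
    ("HD 4600", "Dota 2", "high", 27.0),
]
print(recommend(samples, "HD 4600", "Dota 2"))  # medium
```

The key design point is that the thresholding runs against live aggregates rather than a one-time profile, which is exactly the advantage a continually updating service has over settings shipped statically with a game.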

Meanwhile, from a business perspective there will also be a few important differences between the Intel and AMD deals. Unlike AMD, Intel will not be receiving a branded application, and Intel users will instead be directed to the stock Raptr client. Intel will also not be bundling the application with its drivers as AMD does. What Intel will do instead is offer the Raptr client for download from its website – http://www.intel.com/raptr – while also encouraging OEMs to bundle the client with their prebuilt systems. This approach means Raptr will see a much slower rollout on Intel systems, but if the OEMs do bundle it then in the long run end-user uptake could be much higher.

Wrapping things up, as of press time Intel’s Raptr site has already gone live. Meanwhile Raptr users should see increasing support for Intel HD Graphics in the game settings service as it continues to ramp up its data sets.

33 Comments

  • Shadowmaster625 - Thursday, March 5, 2015 - link

    Like you need a 3rd party game settings utility just to set all the settings to LOW. Hahhahahhah.
  • WithoutWeakness - Thursday, March 5, 2015 - link

    The Raptr logo looks more like a chicken embryo than a raptor.
  • tynopik - Thursday, March 5, 2015 - link

    Repeatedly misspelling 'Raptr' as 'Ratpr', freudian slip?
  • Ian Cutress - Thursday, March 5, 2015 - link

    Adjusted. Ryan probably wrote this while running between meetings at GDC :)
  • MrSpadge - Thursday, March 5, 2015 - link

    If this can help casual users get a better gaming experience from their devices, that's certainly worth something.

    I just wish Intel would be a lot more aggressive in pushing the use of their iGPUs. A simple step would be to adopt FreeSync. Next on the list I'd put "making better use of those pixels". Work with the game engine guys to implement modes where the resolution of the GUI is always native, but the regular scene can be scaled down to keep it from stuttering. Or using non-homogeneous pixel grids with many pixels where differences are and fewer pixels in homogeneous areas.

    AMD and nVidia wouldn't be interested in any of this, as they want to sell you a better card if your old one is too slow. Intel's GPUs, on the other hand, are all slow to begin with. And if users are not satisfied with the experience they don't buy a bigger Intel GPU, they buy a discrete AMD or nVidia.
  • lordken - Saturday, March 7, 2015 - link

    Actually Intel should give up on their iGPU or rather make better use of it (if they intend to waste silicon space). I bet 90% of gamers have it disabled, so what good is such a product? They should focus on making it possible to let the iGPU do some work (offload the CPU with some instructions that a GPU handles better), or even better, give us more CPU cores instead of a lousy iGPU.
    I admit it has its use in servers, HTPCs and desktops that don't game. For gamers who are buying an i7 the iGPU is a laugh and wasted money (you pay for it even if you never use ~2/3 of the die)
  • darkfalz - Monday, March 9, 2015 - link

    It's not bad to have it just in case (i.e. video card RMA'd, troubleshooting, etc.). I don't think the cost of the silicon really adds much to your CPU cost (which is already pretty low historically). An i5 for example has 2MB of disabled cache that is unused, but you don't pay for it. Similarly an i3 has two whole disabled cores and 4MB of cache, but you don't pay for it.
  • darkfalz - Monday, March 9, 2015 - link

    Also, think of it this way - the disabled iGPU gives you a tiny bit more thermal headroom when it is disabled (i.e. to account for the heat that would need to be dissipated from it if it were in use). So if it was not in the processor, you would likely not be able to get as high an overclock.
  • lordken - Monday, March 9, 2015 - link

    Well the silicon and the die space isn't free. If 1/3 of it is for the GPU (the ~2/3 before I took out of a hat; just checked the 4770 review and it's 1/3, maybe before it was more) that you never use, but could give you 2-3 more cores that you would be able to use any time your PC is running, that's a huge difference imo. For the same money you could have ~50% more cores, now you have nothing.
    Anyways nothing is free. It costs Intel money to manufacture it, so they are going to charge you for it even if it is disabled and you never use it. Hell, they could make ~30% more CPUs from the same wafer if they dropped the GPU, that's like 4 CPUs vs 3 CPUs with a useless iGPU

    I guess that if AMD had better CPUs we wouldn't be having this conversation, as Intel would be pushed to use all available die area for cores to keep the edge over AMD.

    btw not sure where you see prices being pretty low. Here, for some reason, both i5 and i7 prices started to climb at the end of 2014 (by 30% in some cases), even before the EUR weakened vs the dollar, so not sure what's happening...
  • darkfalz - Wednesday, March 11, 2015 - link

    "It costs Intel money to manufacture it, so they are going to charge you for it even if it is disabled and you never use it" - this isn't true at all. i3s through i7s come from the same wafer; it just depends on how many parts work come the binning stage. Depending on the market, even working parts are often disabled to meet demand for the lower priced parts. You think the cost of your CPU has anything to do with the cost to manufacture it? It costs Intel LESS to make all the CPUs on the same process. It would cost them more to produce CPUs without the GPU part as they'd need a separate fab.
