OpenCL Gets A CLU

Finally, on top of their OpenGL announcements, Khronos is also making several smaller announcements related to some of their other initiatives. As most of these fall outside of our traditional coverage areas we won’t go into great detail here, but there’s one item that falls under the domain of OpenCL: CLU.

As their OpenCL side project for SIGGRAPH, Khronos is announcing CLU, the Computing Language Utility (ed: Flynn Lives). Since the release of the first OpenCL specification in 2008, Khronos has been looking to expand upon OpenCL both with regard to features (resulting in additional iterations of the standard) and by improving the development tools for OpenCL. CLU falls under this latter effort.

In a nutshell, CLU is a combination of a lightweight API and a template library intended to greatly simplify OpenCL prototyping. As OpenCL was designed to be a relatively low-level language (based on ANSI C), it takes quite a bit of effort to write a program from scratch due to all the facets of OpenCL that must be dealt with. CLU in turn provides a lightweight API that sits on top of a number of templates, the whole of which is designed to abstract away some of that complexity, allowing developers to get started more easily.

Ultimately, developers of complex and high-performance programs will still want to dive into the deepest layers of OpenCL. But for teaching and early prototyping, Khronos believes this will significantly improve the OpenCL experience beyond the current paradigm of getting thrown into the deep end of the pool. For beginners in particular, Khronos is hoping to get the process of writing a first OpenCL program down to under an hour, as opposed to the much longer period it currently takes most newcomers.

Finally, like many of their efforts, Khronos is looking to leverage the wider open source community to further improve CLU. As with the OpenGL Utility Toolkit (GLUT), the usefulness of CLU rests on how much functionality is implemented in the utility; unlike GLUT, however, CLU is open source (under an Intel license), making it easy to fork and extend.


  • bobvodka - Monday, August 6, 2012 - link

Firstly, using the Steam Hardware Survey (which is the correct metric, as we are a AAA games studio), I'll grant you at most 5% of the market, the majority of which have Intel GPUs, for which the OpenGL implementation has generally been... sub-par, to put it mildly.

Secondly, all console development tools are on the PC and based around Visual Studio, so we work in Windows anyway.

Thirdly, the Windows version generally comes about because we need artist/developer tools. Right now it is also useful for learning about and testing 'next gen' ideas with an API which will be close to the Xbox API.

Fourthly, we have a Windows version working which uses D3D11, and OpenGL offers no compelling reason to scrap all that work. Remember, D3D has had working compute shaders with a sane integration for some years now; OpenGL has only just got these, and before that, doing the work with OpenCL was like opening a horrible can of worms due to the lack of standardised and required interop extensions (I looked into this at the back end of last year for my own work at home and quickly despaired at the state of OpenGL and its interop).

Finally, OS X lags OpenGL development. Currently OS X 10.7.3 (as per https://developer.apple.com/graphicsimaging/opengl... ) supports GL 3.2, and I see no mention of the version being supported in 10.8. Given that OpenGL 3.2 was released in 2009 and OS X 10.7 was released last year, I wouldn't pin my hopes on seeing 4.2 any time 'soon'.

Now, supporting 'down market' hardware is of course a good thing to do; however, in D3D11 this is easy (feature levels), while in OpenGL different hardware + drivers = different features, which again increases the engineering workload and the need for fallbacks.
    You could mandate 'required features' but at that point you start cutting out market share and that 5% looks smaller and smaller.

Now, we ARE putting engineering effort into OpenGL|ES, as mobile devices are an important cornerstone from a business standpoint, thus the cost can be justified.

In short: there is no compelling business or technical reason at this juncture to drop D3D11 in favor of OpenGL to capture a fragment of the 5% AAA 'home computer' market, when there are no side benefits and only cost.
  • powerarmour - Monday, August 6, 2012 - link

    Yes because Carmack is always 100% right about everything, and the id Tech 5 engine is the greatest and most advanced around.
  • SleepyFE - Monday, August 6, 2012 - link

id Tech 5 is awesome!! I don't like shooters (except for Prey) but I played Rage just to see how much "worse" OpenGL is. The game looks GREAT; I can't tell it from any other AAA game from the graphics alone. And that means OpenGL is good enough and should be used more. Screw what someone says, try it yourself and then tell me OpenGL can't compete.
  • bobvodka - Monday, August 6, 2012 - link

False logic: games are only as good as their artwork.

OpenGL has shaders, so yes, with good artwork it can do the same as D3D. However the API itself, the thing programmers have to work with, isn't as good, AND up until now it was lacking feature parity with D3D11.

    Feature wise OpenGL is there.
    API/usability wise - it isn't.

FYI: I used OpenGL for around 8 years, from around 1.3 until 3.0 came out, and, like a few others, was so fed up with the ARB at that point that I gave up on GL and moved to a modern API. I'm speaking from an interface-design point of view.
  • Penti - Friday, August 10, 2012 - link

Game engines are perfectly fine supporting different graphics APIs. Obviously non-Windows platforms won't run D3D; Microsoft does not license it. So while they do license stuff like ActiveSync/Exchange, exFAT (which should have been included in the SDXC spec under FRAND terms but isn't), NTFS, remote desktop protocols, OpenXML, binary document formats, SharePoint protocols, some of the .NET environment, etc., most of the vital tech is behind paid licensing. They don't even specify the Direct3D APIs for implementation by non-hardware vendors; it's simply not referenced at all. OpenGL is thoroughly referenced in comparison.

Even though the PS3 isn't OGL (PSGL is OGLES-based) you could still do Cg shaders, or convert HLSL or GLSL shaders or vice versa, so it's not like skills are lost. Tools should be written against the game engines and middleware anyway.

Plus the desktop OGL is compatible with OGLES when it comes to the newer releases such as 4.1 and 4.3, albeit with some tricks/configuration/compatibility modes. Then implementations suck, but that will also be true of some graphics chips' support for DX.
  • inighthawki - Monday, August 6, 2012 - link

    The tessellation feature you're referring to is a brand-specific hardware extension, and not the same class that DirectX's tessellation is. The tessellation hardware introduced for DX11 is a completely programmable pipeline that offers more flexibility. DirectX does not add support for hardware specific features for good reason.
  • djgandy - Tuesday, August 7, 2012 - link

    Tessellation was only added to the GL pipeline in 4.0. It was another one of those 'innovations' where GL copied DX, just like pretty much every other feature GL adds.

    What GL needs to do is copy DX when they remove stuff from the API. Scratch this stupid core/compatibility model, which just adds even more run-time configurations, remove all the old rubbish and do not allow mixing of new features with the old fixed function pipeline.
  • bobvodka - Tuesday, August 7, 2012 - link

There was, 4 years ago, a plan to do just what you described in your second paragraph. Longs Peak was the code name; it was a complete change and cleanup of the API with a modern design, and it was universally praised by those of us following the ARB's newsletters and design plans.

In July 2007 they were 'close' to a release; in October they had 'some issues' to work out. They then went into radio silence, and 6 months later, without bothering to tell anyone what was going on, they rolled out 'OpenGL 3.0' (aka 2.2), where all the grand API changes, worked on for 2 years, were thrown out the window, extensions were bolted on again, and no functionality was removed.

At that point I, and quite a few others, made a loud noise and departed OpenGL development in favour of D3D10 and then D3D11.

Four years on, the ARB are continuing down the same path, and I wouldn't bet my future on them seeing sense any time soon.
  • djgandy - Tuesday, August 7, 2012 - link

The ARB think they are implementing features that developers want, and maybe they are, but AFAIK they have very few big-selling developers anyway.

It seems the ARB is unable to see the reason behind this, maybe because they are so concerned about the politics of backwards compatibility, or at least certain members of it are. For me this is the hardest part to understand, since it is not even a real breaking of compatibility; it is simply ring-fencing new features from old features, thus saving a ton of driver-writing hell (i.e. what DX did). Instead you can still use glBegin/glEnd with your GLSL ARB and geometry shaders, with a bit of fixed-function fog over the top. How useful.

I find it hard to even consider the GL API an abstraction layer, given the existing extension hell and the multiple profiles a driver can opt to support. The end result of this "compatibility" is that anyone actually wanting to sell software using OpenGL has to pick the lowest common denominator... whatever that actually is, because with the newer API you don't even know what you are getting until run time. So then you just pick the ancient version of the API, because at least you have a 99% chance that a GL 3.0 driver will be installed, with all the old fixed-function crud you don't actually need; but glVertex3f is nice, right?

    IMO GL's only hope is for a company like Apple to put it into a high volume product and actually deliver a good contract to developers (core profile only, limited extensions, and say GL 4.0).
  • bobvodka - Tuesday, August 7, 2012 - link

    Unfortunately Apple isn't very on the ball when it comes to OpenGL support.

OS X 10.7, released last year, only supports OpenGL 3.2, a spec released in 2009 that had Windows support within 2 months.

Apple are focusing on mobile, it would seem, where OpenGL|ES is saner and rules the roost.
