A Word on Drivers and Compatibility

As we mentioned earlier, Ubuntu and the Linux kernel are open source projects, licensed primarily under the GPL. In large part due to the philosophies behind the GPL, Linux handles drivers in a notably different fashion than Mac OS X and Windows.

In a nutshell, the developers of the Linux kernel believe in the open source movement and wish for all related software to be open source. Furthermore, they do not like the implications of attaching a closed source “binary blob” driver to the Linux kernel: if something goes wrong inside a driver they do not have the code for, the issue can be impossible to debug. As such they have moral and technical objections to the Linux kernel supporting external drivers, and they actively discourage the creation of such drivers. This is done through mechanisms such as not having a fixed API for external drivers, and by not holding back changes to the kernel that would break external drivers. Drivers they do have the code for can usually just be recompiled against the new kernel and are unaffected as a result. The net result is that “binary blob” drivers are systematically opposed.

For the most part this works fine. Not all hardware is supported under Linux, because not everyone is willing to share the specifications and data needed to make a driver, but enough device manufacturers are willing to share such data that Linux generally supports non-esoteric hardware quite well. There is one class of notable hold-outs here, however, and that’s the GPU manufacturers, namely ATI and NVIDIA.

Compared to other drivers, GPU drivers are different for two reasons. First is the sheer complexity of the drivers: besides interfacing with the hardware, they are responsible for memory management, compiling and optimizing shader code, and providing a great deal of feedback. This in essence makes a GPU driver its own little operating system, one that its developers aren’t necessarily willing to share. The second significant difference follows from the first: GPU drivers are among the only drivers with a compelling reason to be updated regularly, both to better support newer games and to fix bugs in the complex code that runs through them.

Complicating matters further, some of the intellectual property in GPUs and their drivers is not the property of the company that makes the GPU. AMD doesn’t own everything in their Universal Video Decoder, and just about everyone has some SGI IP in their drivers. Since that IP must be protected, releasing the code for drivers containing other companies’ IP is difficult.

Because of all of this, manufacturer-supplied GPU drivers are not always open source. Intel and S3 do well in this respect (largely because they have few tricks to hide, I suspect), but hyper-competitive NVIDIA and AMD do not. AMD has been looking to rectify this; back in 2007 we discussed the start of their work on a new open source driver. Development has been progressing slowly, and for the R6xx and R7xx hardware the open source driver is not yet complete. Meanwhile NVIDIA has shown no real interest in an open source driver for their current hardware.

So if you want to use a modern, high-performance video card with Linux, you have little choice but to also deal with a binary blob driver for that card, and this becomes problematic since, as we mentioned, Linux is designed to discourage exactly that. Both AMD and NVIDIA have found ways around this, but the cost is that installing a binary driver is neither easy nor bug-free.

The fundamental method both use to accomplish this is a kernel shim. Their installers analyze the headers of the installed kernel to identify how that kernel is organized, then compile a shim against it. One end of the shim resolves the lack of a stable kernel API; the other end provides the stable API that NVIDIA’s and ATI’s binary cores need.
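As a sketch of how that works, consider an out-of-tree module build; the paths and module directory here are illustrative, not the vendors’ actual build steps:

```shell
# The shim must be compiled against the headers of the kernel that will
# actually load it, since there is no stable external driver API.
KVER=$(uname -r)
echo "building shim against kernel ${KVER}"

# The installer then drives the kernel's own build system, roughly:
#   make -C /lib/modules/${KVER}/build M=/path/to/shim-source modules
# The resulting module (open source glue plus the vendor's binary core)
# is then loaded with modprobe like any other driver.
```

The first line is the crux: because the shim is rebuilt for each specific kernel, every kernel update potentially means a driver rebuild, which is exactly the fragility described above.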

Ubuntu in particular takes this one step further: in the interest of promoting greater out-of-the-box hardware compatibility, it includes a version of the binary drivers with the distribution. This is unusual for a Linux distribution and has earned Ubuntu some flak, since it’s not strictly adhering to open source ideals, but it also means that we were not forced to play with driver installers just to get Ubuntu fully working. Ubuntu had no issues with either our AMD 2900XT or our NVIDIA 8800GTX, both of which were picked specifically because they are old enough for Ubuntu to have had time to include support for them. With that said, the drivers Ubuntu includes are understandably old (once again owing to the idea of a stable platform), which means we can’t avoid installing drivers if we want better performance and application compatibility.

And this is where “easy” comes to an end. We’ll start with AMD’s installer, the easier of the two. They have a GUI installer that installs the driver along with a Linux version of the Catalyst Control Center. It’s spartan, but it gets the job done.

NVIDIA on the other hand does not have a GUI installer; theirs is a text-mode installer that requires shutting down the X server (the GUI) in order to install. It’s difficult to overstate just how hard this makes driver installation. Not only is the whole process completely non-obvious, but it requires interfacing with the CLI in a way we were specifically trying to avoid. It’s something that becomes bearable with experience, but I can’t call it acceptable.
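For reference, the console dance went roughly like this, assuming Ubuntu’s default GDM display manager; the .run file name is an example of NVIDIA’s naming scheme rather than a specific version:

```shell
# NVIDIA's installer runs in text mode only, from a virtual console
# (Ctrl+Alt+F1), with the X server shut down:
#
#   sudo /etc/init.d/gdm stop                  # 1. stop the X server/GUI
#   sudo sh NVIDIA-Linux-x86-180.44-pkg1.run   # 2. run the text installer
#   sudo /etc/init.d/gdm start                 # 3. restart the GUI
#
# None of which is discoverable from the desktop itself:
echo "stop X, run the installer from a console, then restart X"
```

Compared to AMD’s double-click installer, this assumes you already know about virtual consoles, init scripts, and running installers as root.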

Driver upgrades are an issue on both sides, because the installers are not completely capable of finding and eliminating older versions of the binary drivers. In one instance, for the NVIDIA drivers we had to track down a rather sizable shell script that automatically deleted the old drivers before installing the new ones, as that was deemed the “right” way to install them. We had less of an issue with ATI’s drivers, but to be fair the primary card I used for my time with Ubuntu was the 8800GTX, so I can’t confidently say there are not other issues that I simply never ran into.

The Ubuntu community does supply tools to help with GPU driver installations. One such tool is EnvyNG, which reduces the driver installation process to selecting which driver you want to install; it does the rest. This is a far easier way to install drivers; in the right situation it’s even easier than driver installation is under Windows. But it suffers from needing to have the latest driver data hardcoded into it, which means you can only use it to install drivers it knows about, and nothing newer. It’s not regularly updated (as of this writing the latest driver versions it has are NV 173.14.12 and ATI Catalyst 8.6), so it’s good for installing newer drivers, but not the newest drivers.
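EnvyNG itself comes out of the Ubuntu repositories; the package and command names below are the Hardy-era ones as we understand them, so treat them as illustrative:

```shell
# Install EnvyNG's GTK front-end, then let it pick and fetch the driver:
#
#   sudo apt-get install envyng-gtk
#   envyng -g        # GUI mode; envyng -t gives a text-mode interface
#
# The catch: EnvyNG can only install the driver versions hardcoded into
# the EnvyNG release you have, never anything newer:
echo "envyng only knows the driver versions it shipped with"
```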

The other tool is access to Ubuntu’s Personal Package Archives, which are collections of user-built binaries that can be installed through the Ubuntu package manager (more on this later). PPAs are harder to use than EnvyNG, but anyone can build one, which makes updates more likely. As they’re user-generated, however, the latest drivers still won’t always be available, which means we’re still sometimes back to using ATI and NVIDIA’s installers.
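Once a PPA is added, drivers flow through normal package management. The PPA and package names below are made up for illustration, and the `add-apt-repository` helper only appeared in later Ubuntu releases; on older releases the PPA’s deb lines go into sources.list by hand:

```shell
# Add a (hypothetical) driver PPA, refresh the package lists, and then
# install the packaged driver like any other piece of software:
#
#   sudo add-apt-repository ppa:some-user/nvidia-drivers
#   sudo apt-get update
#   sudo apt-get install nvidia-glx-180
#
# Unlike a vendor .run installer, the package manager now tracks the
# driver's files, so later upgrades and removals are clean:
echo "PPA-packaged drivers upgrade through apt, not a vendor installer"
```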

As it stands, installing new GPU drivers on Ubuntu is between an annoyance and unbearable, depending on how many hoops you need to jump through. It’s certainly not easy.

The other problem with GPU drivers is that they do not always stay working. Among the issues we encountered was ATI’s driver failing to work after installing an Ubuntu update, and an NVIDIA driver that kept rebooting the system during testing for reasons we never determined (once we wiped the system, all was well).

Our final issue with the state of GPU drivers on Ubuntu is their overall quality. With a bit of digging we can come up with issues on both sides of the aisle, so it’s not as if either side is clean here. With that said, we only ended up experiencing issues with ATI’s drivers: we encountered some oddities when moving windows that were eventually fixed in the Catalyst 9.3 drivers. It turns out the problem was that ATI’s drivers lacked support for redirected OpenGL rendering; Linux site Phoronix has a great article on what this is, including videos, that explains the importance of this change.

Ultimately we hate to sound like we’re beating a dead horse here, but we can’t ignore the GPU driver situation on Ubuntu (and really, Linux as a whole). The drivers have too many issues, and installing newer drivers to fix those issues is too hard. Things could be worse (Ubuntu could distribute driver updates only with OS updates, à la Apple), but they could also be better. For the moment it’s the weakest point for Ubuntu when it comes to installing it on a high-end system.


195 Comments


  • zerobug - Monday, February 1, 2010 - link

    Regarding benchmarks and Linux-focused hardware roundups, one thing worthy of consideration is that while Microsoft places strong resources on O/S development to create features that will require end users to get the latest and greatest powerful hardware, Linux places its efforts so that the end user will still be able to use their old hardware and get the best user experience while running the latest and greatest software.
    So, the benchmarks could compare the user experience when running popular software on Microsoft and Linux O/S's, with differently powerful machines.
    For this, you could pick some popular open source and proprietary (or their free equivalents) applications that run on both Linux and W7, and compare the price, time and power consumption for retrieving, saving, processing, compiling, encrypting, decrypting, compacting, extracting, encoding, decoding, backup, restore, number of frames, etc., with machines in a range of different CPU and memory capacities.
  • abnderby - Thursday, September 3, 2009 - link

    Let me say this: I am a Senior Software QA Engineer. I have been testing Windows, Windows apps, DBs and web sites for over 10 years now. I am what you could consider a Windows guru of sorts.

    I have off and on always gone and tried Linux, from Red Hat 5 and 6 to Ubuntu, SUSE, Fedora, etc... Linux is not and has not been ready for mainstream users. Sure, simple email, word docs and web browsing are ok.

    But in order to do many things that I and many advanced Windows users want to do, the author and many commenters are right. Linux people need to get out of their little shell and wake up.

    Linux has such great potential to be a true contender to Windows and OSX. But it lacks simple usability. Out of the box it can come nowhere close to MS or Apple offerings. The out of the box experience is truly horrible.

    Hardware drivers? Good luck; I run RAID cards that have no support. Forget the newest graphics and sound cards. Connecting to shares is, just as the author mentioned, a hassle of a workaround.

    Again, as stated elsewhere, Linux needs someone who programs and/or scripts to get things done right. I have neither the time nor patience for such. I use the command line when needed. I would rather have 2 or 3 clicks and be done than have to remember every CLI command for everything I need to do.

    Time is money; time is not a commodity. Linux wastes too much time.

    It is getting better with each distro, true. But it has been 11 years since Red Hat 5, and Linux is not a whole lot better than it was then.

    What is needed, if Linux really wants to make a stand in the desktop space, is a unified pull together of all distros. Sit down and truly plan out the desktop. Put together a solid platform that out of the box can really put the hurt on MS or Apple.

    Look what Apple did with OSX! And how many developers are working on it? How many developers are working on Linux across all distros? OSX is a jewel; in 7 years it has matured much farther than any *nix distro. And it has a following that cannot yet be challenged by any distro available.

    Why is it that when Win2K came out Linux was claiming to be superior, and yet after 10 years of development it is hardly comparable to XP, let alone Vista/Win 7 or OSX?

    You guys really need to wake up and smell the coffee!

  • Penti - Monday, September 7, 2009 - link

    Of course it's not ready for consumer desktops; there are no serious distributions for that.

    That means no DVD player out of the box, no proprietary codecs, no video editing software, no proprietary drivers which work magically. And of course SLED and RHEL Desktop aren't ready for normal users either; they're targeted at Linux admins who set up the environment, and community distributions won't be as easy for normal users to set up. Community distros will also always lack the above mentioned stuff; it's simply not legal for them to offer it out of the box. OS X is actually older than Linux and ran on x86 before Apple bought Jobs' NeXT company. It's also supported by an OEM (OEM = themselves), which no Linux dist is. It also uses many GNU technologies like GCC, X11 (optional but included on disc), the bash shell and so on, and of course SAMBA for SMB/CIFS; on the server edition they use a modified OpenLDAP server, Dovecot and Postfix for mail, Apache, PHP, Perl, MySQL, etc. Stuff that's developed on Linux and has matured thanks to it.

    There are a lot of problems with having just community supported stuff, but that doesn't mean it's useless or sucks. Sure, the kernel developers aren't really helping get drivers in there, by locking out closed source stuff, but proprietary drivers end up useless if they're not updated anyway. For servers, just buy RHEL- or SLES-certified stuff and you get all the hardware support needed. On the other hand, you wouldn't be much more successful running 7 year old video drivers in Windows either. Community distros definitely don't need to cease existing for the creation of a commercial one. But there will never be one Linux, and that's really the beauty of it, not the curse. It wasn't meant to be something rivaling Windows, and the kernel developers have no desire to create a distro. That's why we can see Linux in stuff like Android and Maemo, and from home routers to mainframes and supercomputers. For a commercial entity, targeting that many devices wouldn't be possible, not with the same basic code and libraries. There are definitely some top notch products and solutions based on Linux and GNU. But Linux doesn't want anything, as it's not an entity. And it's really up to GNOME and KDE to create the desktop environment; it's not the distros that shape them and write all the libraries that software developers use to create their software. As there is no major consumer desktop distro maker, there is also no one who can really steer them by sponsoring work and holding discussions, not towards a unified desktop environment for normal non-tech users anyway. Also, GNOME and KDE have no desire to create an exclusive platform around their software. OS X is an innovative 20 year old OS (since commercial release) and is actually based on work from before then (BSD code). The OS X UI is really 20 years in the making and builds heavily on the NeXT/OpenStep framework.
    On other Unixes there hasn't been any such heritage to build on; X was a total mess on commercial Unixes and I would actually say it's a lot better and more streamlined now. There's just Xorg now; sure, there are a lot of window managers but only two major environments, so it's still better than when all the vendors had their own and couldn't make up their minds on which direction to go and standardize on. In the middle of the 90's there were at least 4 major Unix vendors that all had their own workstations.
  • fazer150 - Friday, September 4, 2009 - link

    Which Linux distro have you tried? Did you try PCLinuxOS, which is at least as usable as Windows XP or 2003?
  • nilepez - Sunday, August 30, 2009 - link

    Most end users are not comfortable with the command line. Linux, even Ubuntu, is still not ready for the masses. This shouldn't be confused with the quality of the OS; it's mostly a GUI issue. I've also had some issues with installers failing. Some were solved from an xterm and others just didn't work.

    It wasn't a big deal in most cases, because there's generally another program that can get the job done, but for the typical home user, it's a deal killer. Nevertheless, I must give credit where credit is due, and Ubuntu has made huge strides in the right direction. The UI isn't close to Windows 7 and I suspect it's not close to OS X either, but Canonical is moving in the right direction.

  • Etern205 - Thursday, August 27, 2009 - link

    See, this is the problem with some Linux users: you guys are somewhat always closed in a nutshell. What you may think is easy does not mean the rest of the world will agree with you. In this day and age, people want to get things done quickly and use the least amount of time possible. For Mac OS X and Windows, getting a simple task done takes like 3 simple clicks; for Ubuntu, performing the same tasks requires the user to do an extensive amount of research just to complete it.

    I'm glad this article was written by an author who has not headed into Linux territory before, and it shows the true side of Linux from the perspective of a new user.

    If you like to do ramen coding and so forth, it does not mean the others will. If Linux wants to become mainstream, then they really need to stand in the shoes of Joe or Jane.
  • forkd - Saturday, October 31, 2009 - link

    I use mac, windows and linux and I must disagree with your assessment of "this is the problem with some linux users"

    This article, and this site for that matter, comes from the perspective of a windows (and some mac) user looking at linux. More specifically Ubuntu. From this point of view of course Linux is difficult. A person who is linux focused thinks windows is difficult at first too and is likely to criticize. If you take the time to learn something instead of just criticizing something because it is different you may be a lot happier.
  • fepple - Thursday, August 27, 2009 - link

    Check out all the usability studies the Gnome Project does, then come back and make some more generalizations :)
  • SoCalBoomer - Thursday, August 27, 2009 - link

    Again - those are done by Linux people. His points are right on... Someone a while ago did a "Mom" test, which is closer to what is needed, not people who know computers doing studies on usability.
