A Word on Drivers and Compatibility

As we mentioned earlier, Ubuntu and the Linux kernel are open source projects, both licensed under the GPL. In large part due to the philosophies behind the GPL, Linux handles drivers in a notably different fashion than Mac OS X and Windows do.

In a nutshell, the developers of the Linux kernel believe in the open source movement and wish for all related software to be open source. Furthermore, they do not like the implications of attaching a closed source “binary blob” driver to the Linux kernel: if something goes wrong, it can be impossible to debug an issue that occurs in a driver they do not have the code for. As such they have moral and technical objections to the kernel supporting external drivers, and they actively discourage the creation of such drivers. This is done through mechanisms such as not having a fixed API for external drivers, and by not refraining from changes to the kernel that would break them. Drivers they do have the code for can usually just be recompiled against the new kernel and so are unaffected. The end result is that “binary blob” drivers are systematically opposed.

For the most part, this works fine. Not all hardware is supported under Linux, because not everyone is willing to share the specifications and data needed to make a driver, but enough device manufacturers are willing that Linux generally supports non-esoteric hardware quite well. There is one notable class of hold-outs here, however: the GPU manufacturers, namely ATI and NVIDIA.

Compared to other drivers, GPU drivers are different for two reasons. First is their sheer complexity – besides interfacing with the hardware, the drivers are responsible for memory management, compiling and optimizing shader code, and providing a great deal of feedback. This in essence makes GPU drivers their own little operating system – one that their developers aren’t necessarily willing to share. The second significant difference is that, because of the above, GPU drivers are among the only drivers with a compelling reason to be updated regularly: they need updates to better support newer games and to fix bugs in the complex code that runs through them.

Complicating matters further, some of the intellectual property in GPUs and their drivers does not belong to the company that makes the GPU. AMD doesn’t own everything in their Universal Video Decoder, and just about everyone has some SGI IP in their drivers. Since that third-party IP must be protected, releasing the source code for drivers that contain it is difficult.

Because of all of this, manufacturer-supplied GPU drivers are not always open source. Intel and S3 do well in this respect (largely because they have few tricks to hide, I suspect), but hyper-competitive NVIDIA and AMD do not. AMD has been looking to rectify this, and back in 2007 we discussed their decision to start work on a new open source driver. Development has been progressing slowly, and for R6xx and R7xx hardware the open source driver is not yet complete. Meanwhile NVIDIA has shown no real interest in an open source driver for their current hardware.

So if you want to use a modern, high-performance video card with Linux, you have little choice but to also deal with a binary blob driver for that card, and this becomes problematic since, as we mentioned, Linux is designed to discourage such a thing. Both AMD and NVIDIA have found ways around this, but the cost is that installing a binary driver is neither easy nor bug-free.

The fundamental method both companies use to accomplish this is a kernel shim. Their installers analyze the kernel headers to identify how that specific kernel is organized, then compile a shim against it. One end of the shim absorbs the kernel’s lack of a stable API, while the other end provides the stable API that NVIDIA’s and ATI’s closed source code needs.
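A minimal sketch of what this looks like in practice, using the kernel’s standard out-of-tree build mechanism – the driver, directory, and module names here are hypothetical; the real installers automate these steps:

```shell
# Build a shim module against the headers of the currently running kernel.
# "vendor_shim" and the source directory are illustrative, not a real driver.
cd /usr/src/vendor-driver-1.0            # assumed unpack location
echo 'obj-m := vendor_shim.o' > Makefile # minimal kbuild makefile
make -C /lib/modules/$(uname -r)/build M=$(pwd) modules
sudo insmod vendor_shim.ko               # load the freshly built shim
```

Because the shim is compiled for one specific kernel, a kernel update silently invalidates the old module – which is part of why binary drivers can stop working after an OS update.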

Ubuntu in particular takes this one step further: in the interest of promoting greater out-of-the-box hardware compatibility, it includes a version of the binary drivers with the distribution. This is unusual for a Linux distribution and has earned Ubuntu some flak, since it’s not strictly adhering to open source ideals, but it also means we were not forced to play with driver installers to get Ubuntu fully working. Ubuntu had no issues with either our AMD 2900XT or our NVIDIA 8800GTX card, both of which were picked specifically because they are old enough to have existed in time for Ubuntu to include support for them. With that said, the drivers Ubuntu includes are understandably old (once again owing to the idea of a stable platform), which means we can’t avoid installing drivers if we want better performance and application compatibility.

And this is where “easy” comes to an end. We’ll start with AMD’s installer, the easier of the two. They have a GUI installer that installs the driver along with a Linux version of the Catalyst Control Center. It’s spartan, but it gets the job done.

NVIDIA on the other hand does not have a GUI installer – theirs is a text-mode installer that requires shutting down the X server (the GUI) in order to install. It’s difficult to overstate just how hard this makes driver installation. Not only is all of this completely non-obvious, but it requires interfacing with the CLI in a way we were specifically trying to avoid. It’s something that becomes bearable with experience, but I can’t call it acceptable.
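For reference, the sequence the NVIDIA installer demands looks roughly like this – the package file name is illustrative (173.14.12 was a then-current version), and the exact init script varies by release:

```shell
# Switch to a text console first (e.g. Ctrl+Alt+F1), then:
sudo /etc/init.d/gdm stop                     # shut down the X server / GUI
sudo sh NVIDIA-Linux-x86-173.14.12-pkg1.run   # run the text-mode installer
sudo /etc/init.d/gdm start                    # bring the desktop back
```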

Driver upgrades are an issue on both sides, because the installers are not completely capable of finding and eliminating older versions of the binary drivers. In one instance with the NVIDIA drivers, we had to track down a rather sizable shell script that automatically deleted the old drivers before installing the new ones, as that was deemed the “right” way to install them. We had less of an issue with ATI’s drivers, but to be fair, the primary card I used for my time with Ubuntu was the 8800GTX, so I can’t confidently say there aren’t other issues I simply didn’t run into.

The Ubuntu community does supply tools to help with GPU driver installation. One such tool is EnvyNG, which reduces the driver installation process to selecting the driver you want to install; it does the rest. This is a far easier way to install drivers – in the right situation it’s even easier than it is under Windows. But it suffers from needing the latest driver data hardcoded into it, which means you can only use it to install drivers it knows about, and nothing newer. It’s not regularly updated (as of this writing the latest driver versions it has are NV 173.14.12 and ATI Catalyst 8.6), so it’s good for installing newer drivers, but not the newest drivers.

The other tool is access to Ubuntu’s Personal Package Archives (PPAs), which are collections of user-built binaries that can be installed through the Ubuntu package manager (more on this later). PPAs are harder to use than EnvyNG, but anyone can build one, which makes updates more likely. As they’re user-generated, however, the latest drivers still won’t always be available, which puts us right back to using ATI’s and NVIDIA’s installers.
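As a sketch of what using a PPA involved at the time (before Ubuntu gained a one-command shortcut for this; the PPA owner and package names below are hypothetical):

```shell
# Add the PPA as an apt source, then install a driver package from it.
echo "deb http://ppa.launchpad.net/some-user/ubuntu hardy main" | \
  sudo tee /etc/apt/sources.list.d/some-user-ppa.list
sudo apt-get update
sudo apt-get install nvidia-glx-new   # driver package name varies by PPA
```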

As it stands, installing new GPU drivers on Ubuntu ranges from annoying to unbearable, depending on how many hoops you need to jump through. It’s certainly not easy.

The other problem with GPU drivers is that they do not always stay working. Among the issues we encountered were ATI’s driver failing to work after installing an Ubuntu update, and an NVIDIA driver that kept rebooting the system during testing for reasons we never determined (once we wiped the system, all was well).

Our final issue with the state of GPU drivers on Ubuntu is their overall quality. With a bit of digging we can come up with issues on both sides of the aisle, so it’s not as if either side is clean here. That said, we only ended up experiencing issues with ATI’s drivers: we encountered some oddities when moving windows that were eventually fixed in the Catalyst 9.3 drivers. It turns out the problem was that ATI’s drivers lacked support for redirected OpenGL rendering; the Linux site Phoronix has a great article on what this is, including videos, that explains the importance of this change.

Ultimately, we hate to sound like we’re beating a dead horse here, but we can’t ignore the GPU driver situation on Ubuntu (and really, Linux as a whole). The drivers have too many issues, and installing newer drivers to fix those issues is too hard. Things could be worse – Ubuntu could distribute driver updates only with OS updates, à la Apple – but they could also be better. For the moment this is the weakest point of installing Ubuntu on a high-end system.

195 Comments

  • jigglywiggly - Wednesday, August 26, 2009 - link

    I see you shared a lot of the same problems I had with Ubuntu when I first got it. Yeah, it's harder, I won't lie, and it's a pain in the ass when it doesn't work. But when it works, you love it, and you feel like more of a man. I use it for my web server, runs very nicely.

    Ubuntu sometimes makes you want to shoot it with a m249, but at other times you feel superior to other users. But that's because you are using the terminal all the time and are actually smart, Mac users just need to be shot in the face for their ignorance.
  • smitty3268 - Wednesday, August 26, 2009 - link

    I agreed with a lot of what was in this review.

    I think a lot of your problems would have gone away by using the newer versions, though, specifically with the package manager. There's much less need for finding things outside of it when you're using the new versions. Even video drivers can usually be put off for 6 months or so if you're not too cutting edge. Leaving the package manager behind is a pain, though, as you found out. You tried to explain that the LTS version was more comparable to Windows/OSX, but in truth very very few desktop users continue to use it. In fact, I'm not aware of any. It's really only used by companies for work machines who don't want to make large changes every 6 months like home users can.

    MSTT fonts. Good luck trying to get those by default, they're owned by microsoft who is in no mood to simply give them away to their competitors. Installing them is like installing the patent encumbered video codecs - at your own risk, which is minimal as long as you aren't trying to make money off of it.

    It should be mentioned that Red Hat put down some money to buy some nice new fonts a while ago, called Liberation, that are much nicer than the default serif ones this old Ubuntu version was using. Still different than the MS ones, though, which is going to cause some people problems. Also, the font anti-aliasing differences are again due to patents owned by other companies, but there's good news there. They're supposed to expire later this year so better font rendering in Linux should be coming soon! You can already get it working manually, but the distros make it hard to setup.

    You mentioned you chose Ubuntu because it was supposed to be user-friendly, which I regard as one of the more puzzling wide-spread myths that go around. Sure, it's a lot simpler than Debian, or some other choices, but it is definitely NOT the distro to choose if you're looking to avoid the CLI, as you found out.

    On that note, I would HIGHLY encourage you to eventually go back and do another review (part 3?) that uses a KDE based distro. Maybe try out OpenSUSE next fall, for example. Although KDE is going through a bit of a transition now, it's definitely where all the more interesting stuff is going on. As you said, Gnome is a lot like a boring Windows XP environment, which is both a positive and a negative. KDE is quite different, for better or worse, and is worth a look I think. For one thing, that smb://COMPUTERNAME address will work out of the box in KDE apps. If you do try KDE, I highly recommend another distro besides (K)Ubuntu, though, because they simply don't put any resources into their KDE implementation and it shows.
  • leexgx - Wednesday, August 26, 2009 - link

Ubuntu KDE has more options to play with that are missing in GNOME (but GNOME's task monitor is far better than KDE's – in the long time I've used Linux, it's like a Linux version of the Windows XP Task Manager, just the Processes page, but very detailed)

Ubuntu should be easy to use, but it lacks an easy install for drivers and still does not offer a fail-safe VGA mode: if X Windows fails to start, you're stuck with a command line. It should try a second time in safe VGA mode, but it does not.
  • Badkarma - Wednesday, August 26, 2009 - link

Thought I'd mention that the Linux-specific site Phoronix has an "Open Letter to Tech Review Sites" (http://www.phoronix.com/scan.php?page=article&...).

You mentioned Linux on netbooks, and I thought I would mention that I found Moblin (www.moblin.org) from Intel very impressive. It's still in beta and a little rough around the edges, but it boots faster than XP resumes from hibernate – around 15 sec from the BIOS screen – and the UI is designed around small screens. After using it for a few hours and then installing Windows 7, I immediately missed how well Moblin was optimized for the low-res small screen. I had to install W7 because the ath9k kernel module drivers are unstable in Moblin; if not for this, I would probably keep it as the primary OS on my netbook.
  • colonel - Wednesday, August 26, 2009 - link

I've been using Ubuntu 9.0 for a year with my Dell notebook and I love it. I don't see limitations in my work; the only problem is my company doesn't allow it on the network, but it's my OS at home.
  • Eeqmcsq - Wednesday, August 26, 2009 - link

    I'm still reading it, but on my xubuntu 8.04, my firefox is located in /usr/bin/firefox. Most apps are under /usr/bin.

    Also, the directory structure is definitely VERY different from Windows. One main difference is that everything that belongs to the user is supposed to be under /home. Everything that belongs to the "system" is everywhere else. I think the theory is that the user stuff is "sandboxed" in /home, so he doesn't mess things up in the system for everyone else.
  • Penti - Tuesday, September 1, 2009 - link

You have the same in Windows under %SystemDrive%\Documents and Settings\user, although many settings are stored in the registry (which can be said to be the equivalent of /etc). It's there, however, that programs like Firefox save their settings and where you have your My Documents and temp files.

* %SystemDrive% is a variable that substitutes for the letter of the drive Windows is installed on, which can be something other than C:.
  • fepple - Wednesday, August 26, 2009 - link

    On the normal Ubuntu install, the /usr/bin/firefox is actually a symlink that points to the firefox install in /usr/lib :)
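A quick way to confirm this on any install is to resolve the link chain yourself (the exact target path varies by release):

```shell
readlink -f /usr/bin/firefox   # prints the final target after resolving all symlinks
```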
  • ioannis - Wednesday, August 26, 2009 - link

The question is, who cares where Firefox or any other application's binary is installed? It's not as if you'll go searching for it to run it. They are on your execution 'PATH', which means you can just press Ctrl+F2 and type their name, use a terminal, or access them from the application menu.

    My favourite way is to use something like gnome-go (or krunner in Kubuntu)

PS: yes, all package-manager-provided applications have their binaries in /usr/bin, and most user-built ones go in /usr/local/bin by default, which is also in your $PATH.
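For the curious, checking both claims from a terminal is straightforward:

```shell
command -v firefox           # prints the first match found on $PATH, if any
echo "$PATH" | tr ':' '\n'   # list the searched directories, one per line
```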
  • fepple - Wednesday, August 26, 2009 - link

    As a developer that has to deal with custom paths or managing symlinks in default paths, I can say I do care where binaries are located ;)
