A Word on Drivers and Compatibility

As we mentioned earlier, Ubuntu and the Linux kernel are open source projects, licensed primarily under the GPL. In large part due to the philosophies of the GPL, Linux handles drivers in a notably different fashion than Mac OS X and Windows.

In a nutshell, the developers of the Linux kernel believe in the open source movement and wish for all related software to be open source. Furthermore, they do not like the implications of attaching a closed source “binary blob” driver to the Linux kernel: if something goes wrong, it can be impossible to debug the issue when it occurs inside a driver for which they do not have the code. As such, they have both moral and technical objections to the kernel supporting external drivers, and they actively discourage the creation of such drivers. This is done through mechanisms such as not offering a stable API for external drivers, and by not holding back kernel changes merely because those changes would break external drivers. Drivers for which they do have the code can usually just be recompiled against the new kernel and are unaffected as a result; “binary blob” drivers, by contrast, are systematically opposed.

For the most part, this works fine. Not all hardware is supported under Linux because not everyone is willing to share the specifications and data needed to make a driver, but enough device manufacturers do share such data that Linux generally supports non-esoteric hardware quite well. There is one class of notable hold-outs here, however, and that’s the GPU manufacturers, namely ATI and NVIDIA.

Compared to other drivers, GPU drivers are different for two reasons. First is the sheer complexity of the drivers: besides interfacing with the hardware, they are responsible for memory management, compiling/optimizing shader code, and providing a great deal of feedback. This in essence makes GPU drivers their own little operating system – one that its developers aren’t necessarily willing to share. The second significant difference follows from the first: because of that complexity, GPU drivers are among the few drivers with a compelling reason to be updated regularly, as they need updates to better support newer games and to fix bugs in the complex code that runs through them.

Complicating matters further, some of the intellectual property in GPUs and their drivers is not the property of the company that makes the GPU. AMD doesn’t own everything in their Universal Video Decoder, and just about everyone has some SGI IP in their drivers. Because that IP must be protected, it is difficult for a manufacturer to release the code for drivers containing other companies’ IP.

Because of all of this, manufacturer-supplied GPU drivers are not always open source. Intel and S3 do well in this respect (largely because they have few tricks to hide, I suspect), but hyper-competitive NVIDIA and AMD do not. AMD has been looking to rectify this, and back in 2007 we discussed their decision to start work on a new open source driver. Development has been progressing slowly, and for the R6xx and R7xx hardware the open source driver is not yet complete. Meanwhile, NVIDIA has shown no real interest in an open source driver for their current hardware.

So if you want to use a modern, high-performance video card with Linux, you have little choice but to also deal with a binary blob driver for that card, and this becomes problematic since, as we mentioned, Linux is designed to discourage such a thing. Both AMD and NVIDIA have found ways around this, but the cost is that installing a binary driver is neither easy nor bug-free.

The fundamental method both companies use to accomplish this is a kernel shim. Each analyzes the headers for the installed kernel to identify how that kernel is organized, then compiles a shim against it. One end of the shim compensates for the kernel’s lack of a stable driver API, while the other end presents the stable API that NVIDIA’s and ATI’s closed source code needs.
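This shim build is essentially the standard out-of-tree kernel module compile. As a rough sketch (the vendor source directory and package names here are hypothetical; the real installers wrap these steps for you):

```shell
# Install headers matching the running kernel, then build the shim
# module against them using the kernel's own build system (kbuild).
sudo apt-get install linux-headers-$(uname -r)
cd /usr/src/vendor-gpu-shim-1.0            # hypothetical vendor source dir
make -C /lib/modules/$(uname -r)/build M=$(pwd) modules
sudo make -C /lib/modules/$(uname -r)/build M=$(pwd) modules_install
```

Because the shim is recompiled for each kernel, a kernel update means the shim must be rebuilt – which is exactly why binary drivers can stop working after an OS update.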

Ubuntu in particular takes this one step further and, in the interest of promoting greater out-of-the-box hardware compatibility, includes a version of the binary drivers with the distribution. This is unusual for a Linux distribution and has earned Ubuntu some flak, since it’s not strictly adhering to some open source ideals, but it also means that we were not forced to play with driver installers to get Ubuntu fully working. Ubuntu had no issues with either our AMD 2900XT or our NVIDIA 8800GTX, both of which were picked specifically because they are old enough for Ubuntu to have had time to include support for them. With that said, the drivers Ubuntu includes are understandably old (once again owing to the idea of a stable platform), which means we can’t avoid installing drivers if we want better performance and application compatibility.

And this is where “easy” comes to an end. We’ll start with AMD’s installer, the easier of the two. They have a GUI installer that installs the driver along with a Linux version of the Catalyst Control Center. It’s spartan, but it gets the job done.

NVIDIA, on the other hand, does not have a GUI installer – theirs is a text-mode installer that requires shutting down the X server (the GUI) in order to install. It’s difficult to overstate just how hard this makes driver installation. Not only is doing all of this completely non-obvious, but it requires interfacing with the CLI in a way we were specifically trying to avoid. It’s something that becomes bearable with experience, but I can’t call it acceptable.
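For the curious, the sequence looks roughly like the following (the exact installer file name depends on the driver version, and the display manager may differ between Ubuntu releases; shown purely for illustration):

```shell
# Switch to a text console (Ctrl+Alt+F1), then shut down the GUI.
sudo /etc/init.d/gdm stop
# Run NVIDIA's text-mode .run installer, which builds the kernel shim.
sudo sh NVIDIA-Linux-x86-*.run
# Bring the desktop back up.
sudo /etc/init.d/gdm start
```

None of this is discoverable from the desktop itself, which is the heart of our complaint.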

Driver upgrades are an issue on both sides, because the installers are not completely capable of finding and eliminating older versions of the binary drivers. In one instance with the NVIDIA drivers, we had to track down a rather sizable shell script that automatically deleted the old drivers before installing the new ones, as that was deemed the “right” way to install them. We had less of an issue with ATI’s drivers, but to be fair, the primary card I used for my time with Ubuntu was the 8800GTX, so I can’t confidently rule out issues I simply never ran into.

The Ubuntu community does supply tools to help with GPU driver installations. One such tool is EnvyNG, which reduces the driver installation process to selecting what driver you want to install; it does the rest. This is a far easier way to install drivers – in the right situation it’s even easier than driver installation under Windows. But it suffers from needing to have the latest driver data hardcoded into it, which means you can only use it to install drivers it knows about, and nothing newer. It’s not regularly updated (as of this writing the latest driver versions it has are NV 173.14.12 and ATI Catalyst 8.6), so it’s good for installing newer drivers, but not the newest drivers.

The other tool is access to Ubuntu’s Personal Package Archives (PPAs), which are a collection of user-built binaries that can be installed through the Ubuntu package manager (more on this later). It’s harder to use than EnvyNG, but anyone can build a PPA, which makes updates more likely. As PPAs are user-generated, however, the latest drivers still won’t always be available, at which point we’re back to using ATI’s and NVIDIA’s installers.
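Installing from a PPA looks like any other package install once the archive is added. A minimal sketch, assuming a hypothetical PPA name and driver package (real names vary by maintainer and Ubuntu release):

```shell
# Add a (hypothetical) PPA hosting newer NVIDIA driver packages,
# refresh the package lists, and install through the package manager.
sudo add-apt-repository ppa:some-maintainer/nvidia-drivers
sudo apt-get update
sudo apt-get install nvidia-glx-180
```

The advantage over the vendor installers is that the package manager can cleanly remove or upgrade the driver later, rather than leaving stale files behind.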

As it stands, installing new GPU drivers on Ubuntu is between an annoyance and unbearable, depending on how many hoops you need to jump through. It’s certainly not easy.

The other problem with GPU drivers is that they do not always stay working. Among the issues we encountered was ATI’s driver failing to work after installing an Ubuntu update, and an NVIDIA driver that kept rebooting the system during testing for reasons we never determined (once we wiped the system, all was well).

Our final issue with the state of GPU drivers on Ubuntu is their overall quality. With a bit of digging we can come up with issues on both sides of the aisle, so it’s not as if either side is clean here. But with that said, we only ended up experiencing issues with ATI’s drivers. We encountered some oddities when moving windows that were eventually fixed in the Catalyst 9.3 drivers. It turns out that the problem was that ATI’s drivers lacked support for redirected OpenGL rendering; the Linux site Phoronix has a great article on what this is, including videos, that explains the importance of this change.

Ultimately, we hate to sound like we’re beating a dead horse here, but we can’t ignore the GPU driver situation on Ubuntu (and really, Linux as a whole). The drivers have too many issues, and installing newer drivers to fix those issues is too hard. Things could be worse (Ubuntu could distribute driver updates only with OS updates, à la Apple), but they could also be better. For the moment it’s the weakest point for Ubuntu when it comes to installing it on a high-end system.
