It’s Secure

Security is a tough nut to crack, both with respect to making something secure and judging something to be secure. I’m going to call Ubuntu secure, and I suspect that there’s going to be a lot of disagreement here. Nonetheless, allow me to explain why I consider Ubuntu secure.

Let’s first throw out the idea that any desktop OS can be perfectly secure. The weakest component in any system is the user – if they can install software, they can install malware. So while Ubuntu would be extremely secure if the user could not install any software, it would not be very useful that way. Ubuntu is just as capable of picking up malware as any other desktop OS out there if the user is dedicated enough. The dancing pigs problem is not solved here.

Nevertheless, Ubuntu is more secure than other OSes (and let’s be frank, we’re talking about Windows) for two reasons: the first is practical, and the second is technical.

To completely butcher a metaphor here: if your operating system has vulnerabilities and no one is exploiting them, is it really vulnerable? The logical answer to that is “yes”, and yet that’s not quite how things work. Or to put it more simply: when was the last time you saw a malware outbreak ravaging the Ubuntu (or any other desktop Linux distro) community?

Apple often gets nailed for this logic, and yet I have a hard time disagreeing with it. If no one is trying to break into your computer, then right now, at this moment, it’s secure. The Ubuntu and Mac OS X user bases are so tiny compared to that of Windows that attacking anything but Windows makes very little sense from an attacker’s perspective.

It’s true that they’re soft targets – few machines run anti-virus software and there’s no other malware to fend off – but that does not seem to be driving any kind of significant malware creation for either platform. This goes particularly for Mac OS X, where security researchers have been warning about the complacency this creates, but other than a few proof-of-concept trojan horses, the only time anyone seems to make a real effort to break into a Mac is to win one.

So I am going to call Ubuntu, with its even smaller user base and lack of active threats, practically secure. No one is trying to break into Ubuntu machines, and there are a number of years’ worth of history with the similarly positioned Mac OS X saying that this isn’t going to change. There just aren’t any credible threats to be worried about right now.

With that said, there are also plenty of good technical reasons why Ubuntu is secure; while it may be practically secure, it would also be difficult to break into the OS even if you wanted to. Probably the most noteworthy aspect here is that Ubuntu does not ship with any outward-facing services or daemons, which means there is nothing listening that could be compromised to facilitate a fully automated remote code execution attack. Windows has historically been compromised many times through these attacks, most recently in October of 2008. Firewalls are intended to prevent these kinds of issues, but there is always someone out there who manages to be completely exposed to the internet anyhow, so not having any outward-facing services in the first place is an excellent design decision.
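
If you want to check this for yourself, the list of listening sockets can be pulled up from a terminal. A minimal sketch using netstat, which ships with Ubuntu; the flags below are just one common combination:

    # Show all listening TCP and UDP sockets along with the owning process
    # (sudo lets netstat resolve the process name for every socket)
    sudo netstat -tulpn

    # Anything bound to 127.0.0.1 is reachable only from the local machine;
    # anything bound to 0.0.0.0 or a LAN address is exposed to the network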

Less encouraging among Ubuntu’s design choices, however, is that in part because of the lack of services to expose, the OS does not ship with an enabled firewall. The Linux kernel does have built-in firewall functionality through iptables, but out of the box Ubuntu lets everything in and out. This is similar to how Mac OS X ships, and significantly different from how Windows Vista ships, which blocks all incoming connections by default. Worse yet, Ubuntu doesn’t ship with a GUI to control the firewall either (something Mac OS X does have), which necessitates pulling down a third-party package or configuring it via the CLI.

Operating System | Inbound | Outbound
Windows Vista | All applications blocked, applications can request an open port | All applications allowed, complex GUI to allow blocking them
Ubuntu 8.04 | All applications allowed, no GUI to change this | All applications allowed, no GUI to change this
Mac OS X 10.5 | All applications allowed, simple GUI to allow blocking them | All applications allowed, no GUI to change this

Now to be fair, even if Ubuntu had shipped with a GUI tool for configuring its firewall, I likely would have set it up exactly the same way I leave Mac OS X set up (all incoming connections allowed); nevertheless, I find myself scratching my head. Host-based firewalls aren’t the solution to all that ails computer security, but they’re still a good idea. I would rather see Ubuntu ship like Vista does, with an active firewall blocking incoming connections.
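
For those who do want Vista-like behavior, it can be had from the CLI today. Ubuntu 8.04 introduced ufw, a simple command-line front end to the kernel’s iptables firewall, though it is left disabled out of the box. A minimal sketch, with the SSH rule included purely as an example:

    # Turn the firewall on (the setting persists across reboots)
    sudo ufw enable

    # Refuse unsolicited incoming connections by default
    sudo ufw default deny

    # Open an individual port if needed, e.g. SSH (only if you actually run sshd)
    sudo ufw allow 22/tcp

    # Review the resulting rules
    sudo ufw status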

Backwards compatibility, or rather the lack thereof, is also a technical security benefit for Ubuntu. Unlike Windows, which attempts to provide security while still supporting old software that pre-dates modern security in Windows, Ubuntu does not have any such legacy software to deal with. Since Linux has supported the traditional *nix security model from the get-go, properly built legacy software should not expect free rein of the system when running, and hence should not present a modern vulnerability. This is more an artifact of previous design than a feature, but it bears mentioning as one pillar of the overall security picture.

Moving on, there is an interesting element of Ubuntu’s design that makes it more secure, though I hesitate to call it intentional. Earlier I mentioned how an OS that doesn’t let a user install software isn’t very useful, and Ubuntu falls under this umbrella somewhat. Because the OS is based heavily around a package manager and signed packages, it’s not well geared towards installing software outside of the package manager. Depending on how it’s packaged, a downloaded application often needs to be manually assigned an executable flag before it can be run, significantly impairing a user’s ability to blindly run whatever they click on. It’s genuinely hard to run non-packaged software on Ubuntu, and in this case that’s a security benefit – it’s that much harder to coerce a user into running malware, even if the dancing pigs problem isn’t solved.
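
To illustrate the extra hoop, here is roughly what running a downloaded program looks like from a terminal; the file name is hypothetical:

    # A freshly downloaded file typically arrives without the executable bit set,
    # so simply trying to run it fails with "Permission denied"
    # (the file name here is purely illustrative)
    ./some-downloaded-installer.run

    # The user has to deliberately mark the file executable before it will run
    chmod +x ./some-downloaded-installer.run
    ./some-downloaded-installer.run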

Rounding out the security underpinnings of Ubuntu, we have the more traditional mechanisms. No-eXecute (NX) bit support helps block the execution of code injected through buffer overflow attacks, and Address Space Layout Randomization makes targeting specific memory addresses harder. The traditional *nix sudo mechanism keeps software running with user privileges unless it is specifically authenticated to take on full root abilities, making it functionally similar to UAC on Vista (or rather, the other way around). Finally, Ubuntu comes with the AppArmor and SELinux security policy features that allow the OS to be locked down further, although these are generally overkill for home use.
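
For the curious, the state of most of these mechanisms can be inspected from a terminal. This is only a rough sketch of where to look, not a security audit:

    # Address space layout randomization: 0 means off, a non-zero value means enabled
    cat /proc/sys/kernel/randomize_va_space

    # NX support shows up as the "nx" flag in the CPU capability list
    grep flags /proc/cpuinfo | head -n 1

    # What the current user is permitted to run through sudo
    sudo -l

    # Which AppArmor profiles are loaded, and whether they are enforcing
    sudo apparmor_status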

There’s one last issue I’d like to touch on when it comes to technical security measures, and that’s the nature of open source software. There is a well-reasoned argument that open source software is more secure because it allows for anyone to check the source code for security vulnerabilities and to fix them. Conversely, being able to see the source code means that such vulnerabilities cannot be completely obscured from public view.

It’s not a settled debate, nor do I intend to settle it, but it bears mentioning. Looking through the list of updates on a fresh Ubuntu install and the CERT vulnerability list, there are a number of potential vulnerabilities in various programs included with Ubuntu – Firefox, for example, has been patched for vulnerabilities seven times now. There are enough vulnerabilities that I don’t believe simply counting them is a good way to decide whether Ubuntu being open source has a significant impact on improving its security. This also comes full circle with the notion of Ubuntu being practically secure (are there more vulnerabilities that people aren’t bothering to look for?), but nevertheless it’s my belief that being open source is a security benefit for Ubuntu here, even if I can’t completely prove it.

Because of the aforementioned ability to see and modify any and every bit of code in Ubuntu and its applications, Ubuntu also gains a security advantage: users can manually patch flaws immediately (assuming they know how), and security updates are pushed out just about as rapidly as humanly possible. This is a significant distinction from Windows and Patch Tuesday, and while Microsoft has a good business reason for batching patches (IT admins would rather get all their patches at once than test new patches constantly), it’s not good technical reasoning. Ubuntu is more secure than Windows by virtue of patching most vulnerabilities sooner.
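
In practice, both halves of that advantage run through APT. A minimal sketch, using Firefox purely as an example package:

    # Fetch the latest package lists and apply any pending updates, security fixes included
    sudo apt-get update
    sudo apt-get upgrade

    # For those inclined to patch by hand, the source of any package can be pulled down
    # (this assumes the deb-src lines in /etc/apt/sources.list are enabled)
    apt-get source firefox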

Finally, there are certainly areas where Ubuntu’s security could stand to improve. I’ve already touched on the firewall situation, but sandboxing is the other notable weakness here. Windows has seen a lot of work put into sandboxing Internet Explorer so that machines cannot get hit with drive-by malware downloads, and it has proven to be effective. Both Internet Explorer and Google’s Chrome implement sandboxes using different methods, with similar results. Meanwhile Chrome is not yet ready for Linux, and Firefox lacks sandboxing abilities. Given the importance of the browser in certain kinds of malware infections, Ubuntu would benefit greatly from having Firefox sandboxed, even if no one is specifically targeting Ubuntu right now.
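
In the meantime, the AppArmor tooling mentioned earlier can approximate a sandbox by confining Firefox from the outside. The sketch below uses the apparmor-utils helpers; the profile name shown is illustrative, since the real name depends on the path of the Firefox binary, and Ubuntu does not ship such a profile by default:

    # Install the profile-management helpers
    sudo apt-get install apparmor-utils

    # Interactively build a profile by exercising Firefox while aa-genprof records
    # the files and capabilities it touches
    sudo aa-genprof firefox

    # Once the profile looks reasonable, switch it from complain mode to enforce mode
    # (substitute whatever profile name aa-genprof actually created)
    sudo aa-enforce /etc/apparmor.d/usr.lib.firefox.firefox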

Comments

  • brennans - Sunday, August 30, 2009

    I use both XP64 and Hardy (Ubuntu 8.04).
    I am also a power user.

    Both these operating systems have pros and cons.

    Cons for XP64:
    1. It does not recognize my hardware properly.
    2. Finding 64 bit drivers was/is a mission.

    Cons for Hardy:
    1. It does not plug and play with my hardware (I have to compile the drivers).
    2. Not as user friendly as Windows.

    Pros for XP64:
    1. Windowing system is super fast.
    2. User friendly.

    Pros for Hardy:
    1. Recognizes my hardware.
    2. Command line tools are awesome.

    Conclusion:
    I think that the article was good.

    I am one of those people who have always had problems installing Windows straight out of the box, and thus I find that paying a large amount of money for their buggy OS is unacceptable.

    I can get a lot of stuff done with Hardy, it is free, and if I find a problem with it I can potentially fix that problem.

    I also find it unacceptable that manufacturers do not write software (drivers or application software for their devices) for Linux.

    For me, it is difficult to live without both XP64 and Hardy.



  • ciukacz - Sunday, August 30, 2009

    http://www.iozone.org/
  • JJWV - Sunday, August 30, 2009

    How can people use something like Aero and its Linux or OS X equivalents (which pre-date it, if I am not mistaken)? The noise is just hiding the information. Transparency is one issue; another is those icons that are more like pictures: one loses the instant recognition. With Aero, knowing which window is active is not obvious; you have to look at small details. The title of the window is surrounded by mist, making it more difficult to read. Even with XP, the colour gradient in a title bar is just noise: there is no information conveyed by it.

    The OS GUIs look more and more like those weird media players, with an image of a rotary knob that has to be manipulated like a slider.

    The evolution of all applications towards a web interface reminds me of the prehistory of personal computers: each program has its own interface.

    The MS Office Ribbon UI is in the same vein: more than 20 icons on each tab. The icon interface is based on instant recognition and comprehension; when you have so many, it turns into a mnemonics exercise. And of course with MS one does not have a choice: you just have to adapt to the program. An end user is only there to be of service to the programs ;-)

    If I want to look at a beautiful image I will do so, but when I want to write a letter or update a database, all those ultra-kitsch visual effects are just annoying.

    In summary, the noise is killing the information and thus the usability.
  • Ronald Pottol - Saturday, August 29, 2009

    The thing with Windows has been seen before: back in the Win 3.1/OS/2 days it was found that while one instance of Excel didn't run any faster under OS/2, two in separate VMs (OK, not technically the same thing) ran in about the same time as one on Windows.

    I like the package management, and hate when I have to install something that doesn't support it; it means I have to worry about updates all by myself. If they have one, then I get updates every time I check for Ubuntu updates, which is very handy. Nice to get the nightly Google Chrome builds, for instance (still alpha/very beta).

    Frankly, supporting binary kernel drivers would be insane. The kernel developers would be stuck supporting code they cannot look at and cannot fix; they couldn't fix their own mistakes (or would be stuck emulating them forever). If they supported binary drivers, there would be even more of them, and when the developers wanted to fix something broken or that was a bad idea, they would have to wait a reasonable amount of time before doing it so that the old behavior stayed supported. Frankly, I don't see why people don't have automated frameworks for doing this, with automated deb/rpm repository generation: I add the vendor's repository, and when I get a kernel update, perhaps it is held up a day for their system to automatically build a new version, but then it all installs. Instead, I am stuck with having to run a very old kernel, or not having 3D on my laptop, for instance.
  • cesarc - Sunday, August 30, 2009

    I found this article very interesting, because it is oriented toward Windows users and is helpful to them, since you didn't just die trying it.
    But you can't blame Ubuntu (or any distro) for the pain in the ass a video card's driver can be to install; blame ATI and NVIDIA for being lazy. And if using Wine for playing games is not as good as playing in Windows, blame the game companies for not releasing a GNU/Linux version.
    Also, the reason GNU/Linux surpasses Windows in file management is that NTFS is a BAD file system; maybe if Windows could somehow run on ext3 it would be even better than it is.
    And why the reluctance to use a console (stop saying CLI, please)? You are not opening your mind if you try to use GNU/Linux as if it were Windows; it is a completely different OS. Look at it from this point of view: something that you can do in Windows with 5 clicks you can maybe do in GNU/Linux with just one line of bash code. So sometimes you will use the GUI and other times you will use the console, and you will find that having these options is very comfortable. So start using the console and write the same article a year later.
    I hope some day to have a paid version of GNU/Linux (still open source) that could pay salaries to programmers to fix specific issues in the OS.
    On the other hand, when you do the IT benchmarks it is very disappointing that you don't use Linux on those beautiful Xeons. The server environment is where GNU/Linux is strongest. And Xeons with Windows are just toys compared with Unix on SPARC or POWER architectures.

    PS: try to get 450 days of uptime on Windows 2003.
  • rkerns - Saturday, August 29, 2009

    Ryan,

    Thanks for your good work.

    Many people considering linux are still on dial-up. These are often folks with lesser expertise who just want to get connected and use their computer in basic ways. But getting connected with dial-up is something of an adventure with many distros and/or versions. Ubuntu 9.04 has moved away from easy dial-up, but Mint7KDE includes KPPP for simple dial-up connection. Mint7KDE has other nice features as well.

    I am asking you to expand your current picture of the landscape to include people who want to use linux with a dial-up connection. This of course would have to include a brief discussion of 1) appropriate modems and 2) distro differences. Thanks,
    r kerns
  • William Gaatjes - Saturday, August 29, 2009

    GRATIS

    pant pant pant pant

    pant pant pant pant
  • lgude - Saturday, August 29, 2009

    Really glad to find this in-depth article after all this time. Thank you, Ryan. I too have run Ubuntu as my main OS even though most of my experience is in Windows, and I have had similar experiences. Because this was a very long article, it got into detail about things like the package manager or the multiple desktops that I have not seen discussed elsewhere from a user perspective. As someone else pointed out, it is moot what people would like or complain about if they were moving from Linux to Windows or OS X, but imagine for a moment if they were used to getting the OS and all their apps updated in one hit and were asked to do it one app at a time, and expected to pay for the privilege!

    If you go on with the Linux series, I'd like to see discussion of upcoming Ubuntu releases and other distros - I've been impressed with SUSE. I'd also like to see projects on how to build a Linux server and HTPC - including choice of distro and the kind of hardware needed. I'm less sure of where benchmarking is really useful - the tradition of detailed benchmarking at AT arose from the interest in overclocking and gaming, which I think is a much lesser consideration in Linux. More relevant might be comparisons of netbook-specific distros, or how to work out if that old P4 will do as a home server. There is a lot of buzz in the tech world about things like Symbian, Chrome OS, Moblin, and Maemo on portable devices that could possibly draw new readers to the Linux tab at AT. A great start in any case.
  • jmvaughn - Saturday, August 29, 2009

    I just wanted to say thank you to the author for a very thorough article. After reading it, I decided to use Ubuntu for a PC I'm building out of spare parts for a retired friend who's on a fixed income. My friend just uses the web, e-mail, and some word processing, so this will be perfect.

    The article gave me a good idea of what to expect -- a good honest appraisal with all the good and bad. After installing Ubuntu 9.04, I am very impressed. The install was very quick, and easier than XP. Everything is quite snappy, even though it's running on an AMD 3800+ single-core processor and an old hard drive.

  • xchrissypoox - Saturday, August 29, 2009

    I only skimmed the article (I saw the part on gaming being poor), but I'd like to see a comparison of several games using the same hardware on Windows and Linux (results given in fps). If this has already been mentioned, sorry, and good day.
