It’s Secure

Security is a tough nut to crack, both with respect to making something secure and judging something to be secure. I’m going to call Ubuntu secure, and I suspect that there’s going to be a lot of disagreement here. Nonetheless, allow me to explain why I consider Ubuntu secure.

Let’s first throw out the idea that any desktop OS can be perfectly secure. The weakest component in any system is the user – if they can install software, they can install malware. So while Ubuntu would be extremely secure if the user could not install any software, it would not be very useful that way. Ubuntu is just as capable of catching malware as any other desktop OS out there if the user is dedicated enough. The dancing pigs problem is not solved here.

Nevertheless, Ubuntu is more secure than other OSes (and let’s be frank, we’re talking about Windows) for two kinds of reasons: practical and technical.

To completely butcher a metaphor here: if your operating system has vulnerabilities and no one is exploiting them, is it really vulnerable? The logical answer is “yes”, and yet that’s not quite how things work in practice. Or more simply put: when was the last time you saw a malware outbreak ravaging the Ubuntu (or any desktop Linux distro) community?

Apple often gets nailed for this logic, and yet I have a hard time disagreeing with it. If no one is trying to break into your computer, then right now, at this moment, it’s secure. The Ubuntu and Mac OS X user bases are so tiny compared to that of Windows that attacking anything but Windows makes very little sense from an attacker’s perspective.

It’s true that they’re soft targets – few machines run anti-virus software and there’s no other malware to fend off – but that does not seem to be driving any significant malware creation for either platform. This goes particularly for Mac OS X, where security researchers have been warning about the complacency this creates, but other than a few proof-of-concept trojan horses, the only time anyone seems to make a real effort to break into a Mac is to win one.

So I am going to call Ubuntu, with its smaller-still user base and lack of active threats, practically secure. No one is trying to break into Ubuntu machines, and several years’ worth of history with the similarly situated Mac OS X says that this is not going to change. There just aren’t any credible threats to be worried about right now.

With that said, there are plenty of good technical reasons too for why Ubuntu is secure; beyond being practically secure, it would also be difficult to break into the OS even if you wanted to. Probably the most noteworthy aspect here is that Ubuntu does not ship with any outward-facing services or daemons, which means there is nothing listening that can be compromised to facilitate a fully automated remote code execution attack. Windows has historically been compromised many times through such attacks, most recently in October of 2008. Firewalls are intended to prevent these kinds of issues, but there is always someone out there who manages to be completely exposed to the internet anyhow; hence not shipping any outward-facing services in the first place is an excellent design decision.
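
If you want to verify this for yourself, a quick check from the terminal will do. This is a minimal sketch using the stock netstat tool; the exact output will vary from install to install:

    # List all listening TCP and UDP sockets along with the owning process.
    # A default Ubuntu desktop install should show little to nothing bound
    # to external interfaces.
    sudo netstat -tulnp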

Less encouraging among Ubuntu’s design choices, however, is that, in part because of the lack of services to expose, the OS does not ship with an enabled firewall. The Linux kernel does have built-in firewall functionality through iptables, but out of the box Ubuntu lets everything in and out. This is similar to how Mac OS X ships, and significantly different from how Windows Vista ships, which blocks all incoming connections by default. Worse yet, Ubuntu doesn’t ship with a GUI to control the firewall either (something Mac OS X does have), which necessitates pulling down a third-party package or configuring it via the CLI.

Operating System | Inbound | Outbound
Windows Vista | All applications blocked; applications can request an open port | All applications allowed; complex GUI to allow blocking them
Ubuntu 8.04 | All applications allowed; no GUI to change this | All applications allowed; no GUI to change this
Mac OS X 10.5 | All applications allowed; simple GUI to allow blocking them | All applications allowed; no GUI to change this

Now to be fair, even if Ubuntu had shipped with a GUI tool for configuring its firewall, I likely would have set it up exactly the same way I leave Mac OS X set up: all incoming connections allowed. Nevertheless, I find myself scratching my head. Host-based firewalls aren’t the solution to all that ails computer security, but they are a good idea. I would rather see Ubuntu ship like Vista does, with an active firewall blocking incoming connections.
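
For those willing to work from the CLI, Ubuntu 8.04 does include ufw (Uncomplicated Firewall), a command-line frontend to iptables, although it ships disabled. As a rough sketch of bringing it in line with Vista’s defaults (the SSH rule below is purely an illustrative assumption):

    # Turn the firewall on and drop unsolicited incoming traffic
    sudo ufw enable
    sudo ufw default deny
    # Optionally open a port for a service you actually run, e.g. SSH
    sudo ufw allow 22/tcp
    # Review the resulting configuration
    sudo ufw status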

Backwards compatibility, or rather the lack thereof, is also a technical security benefit for Ubuntu. Unlike Windows, which attempts to provide security while still supporting old software that pre-dates modern security in Windows, Ubuntu does not have any such legacy software to deal with. Since Linux has supported the traditional *nix security model from the get-go, properly built legacy software should not expect free rein of the system when running, and hence should not constitute a modern vulnerability. This is more an artifact of previous design than a feature, but it bears mentioning as a pillar of total security.

Moving on, there is an interesting element of Ubuntu’s design that makes it more secure, but I hesitate to call it intentional. Earlier I mentioned how an OS that doesn’t let a user install software isn’t very useful, but Ubuntu falls under this umbrella somewhat. Because the OS is based heavily around a package manager and signed packages, it’s not well-geared towards installing software outside of the package manager. Depending on how it’s packaged, many downloaded applications need to be manually assigned an executable flag before they can be run, significantly impairing a user’s ability to blindly run anything they click on. It’s genuinely hard to run non-packaged software on Ubuntu, and in this case that’s a security benefit: it’s that much harder to coerce a user into running malware, even if the dancing pigs problem isn’t solved.
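
As an illustration, launching a downloaded installer (the file name here is a made-up example) typically requires explicitly marking it executable first, either from the file manager’s properties dialog or from the terminal:

    # Downloaded files do not carry the executable bit by default
    chmod +x ./some-installer.run
    ./some-installer.run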

Rounding out the security underpinnings of Ubuntu, we have the more traditional mechanisms. No-eXecute bit support helps to prevent buffer overflow attacks, and Address Space Layout Randomization makes targeting specific memory addresses harder. The traditional *nix sudo security mechanism keeps software running with user privileges unless specifically authenticated to take on full root abilities, making it functionally similar to UAC on Vista (or rather, the other way around). Finally, Ubuntu comes with the AppArmor and SELinux security policy features that enable further locking down the OS, although these are generally overkill for home use.
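
Some of these protections can be inspected from a terminal; a quick sketch, assuming a stock install:

    # A nonzero value means address space layout randomization is enabled
    cat /proc/sys/kernel/randomize_va_space
    # Show which AppArmor profiles are loaded and enforcing; sudo itself
    # demonstrates the authenticate-to-escalate model in action
    sudo apparmor_status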

There’s one last issue I’d like to touch on when it comes to technical security measures, and that’s the nature of open source software. There is a well-reasoned argument that open source software is more secure because it allows anyone to check the source code for security vulnerabilities and to fix them. Furthermore, being able to see the source code means that such vulnerabilities cannot be completely obscured from public view.

It’s not a settled debate, nor do I intend to settle it, but it bears mentioning. Looking through the list of updates on a fresh Ubuntu install and the CERT vulnerability list, there are a number of potential vulnerabilities in various programs included with Ubuntu; Firefox, for example, has been patched for vulnerabilities seven times now. There are enough vulnerabilities that I don’t believe simply counting them is a good way to decide whether Ubuntu being open source significantly improves its security. This also comes full circle with the notion of Ubuntu being practically secure (are there more vulnerabilities that people aren’t bothering to look for?), but nevertheless it’s my belief that being open source is a security benefit for Ubuntu here, even if I can’t completely prove it.

Because of the aforementioned ability to see and modify any and every bit of code in Ubuntu and its applications, Ubuntu gains a further security advantage: users can manually patch flaws immediately (assuming they know how), and security updates are pushed out just about as rapidly as humanly possible. This is a significant distinction from Windows and Patch Tuesday, and while Microsoft has a good business reason for batching patches (IT admins would rather get all their patches at once than test new patches constantly), it’s not good technical reasoning. Ubuntu is more secure than Windows by virtue of patching most vulnerabilities sooner.
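
A nice corollary is that keeping a system current is trivial; a minimal sketch, assuming the default Ubuntu repositories are configured:

    # Refresh the package lists, then install all pending updates,
    # security fixes included
    sudo apt-get update
    sudo apt-get upgrade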

Finally, looking at Ubuntu, there are certainly areas where security could improve. I’ve already touched on the firewall situation, but sandboxing is the other notable weakness here. Windows has seen a lot of work put into sandboxing Internet Explorer so that machines cannot get hit with drive-by malware downloads, and it has proven to be effective. Both Internet Explorer and Google’s Chrome implement sandboxes using different methods, with similar results. Meanwhile Chrome is not yet ready for Linux, and Firefox lacks sandboxing abilities. Given the importance of the browser in certain kinds of malware infections, Ubuntu would benefit greatly from having Firefox sandboxed, even if no one is specifically targeting Ubuntu right now.
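
In principle, AppArmor could be pressed into service here: a confinement profile for Firefox would limit what a compromised browser process can read or write. As a hedged sketch only (no such profile ships enabled on Ubuntu 8.04, and the profile path below is an assumption), enforcement would look something like this:

    # Put a hypothetical Firefox profile into enforcing mode, then verify
    sudo aa-enforce /etc/apparmor.d/usr.bin.firefox
    sudo apparmor_status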

Comments

  • zerobug - Monday, February 1, 2010

    Regarding benchmarks and Linux-focused hardware roundups, one thing worth considering is that while Microsoft pours resources into OS development to create features that require end users to get the latest and greatest hardware, Linux puts its effort into letting end users keep their old hardware and still get the best experience while running the latest and greatest software.
    So, the benchmarks could compare the user experience when running popular software on Microsoft and Linux OSes on machines of varying power.
    For this, you could pick some popular open source and proprietary (or their free equivalent) applications that run on both Linux and W7, and compare the price, time, and power consumption for retrieving, saving, processing, compiling, encrypting, decrypting, compacting, extracting, encoding, decoding, backup, restore, number of frames, etc., on machines with a range of different CPU and memory capacities.
  • abnderby - Thursday, September 3, 2009

    Let me say this: I am a Senior Software QA Engineer, and I have been testing Windows, Windows apps, DBs, and web sites for over 10 years now. I am what you could consider a Windows guru of sorts.

    I have off and on gone and tried Linux, from Red Hat 5 and 6 to Ubuntu, SUSE, Fedora, etc. Linux is not and has not been ready for mainstream users. Sure, for simple email, word docs, and web browsing it is OK.

    But for many things I want to do, and that many advanced Windows users want to do, the author and many commenters are right. Linux people need to get out of their little shell and wake up.

    Linux has such great potential to be a true contender to Windows and OS X. But it lacks simple usability. Out of the box it comes nowhere close to the MS or Apple offerings. The out-of-the-box experience is truly horrible.

    Hardware drivers? Good luck. I run RAID cards that have no support. Forget the newest graphics and sound cards. Connecting to shares, just as the author mentioned, is a hassle of a workaround.

    Again, as stated elsewhere, Linux needs someone who programs and/or scripts to get things done right. I have neither the time nor the patience for such. I use the command line when needed, but I would rather have 2 or 3 clicks and be done than have to remember every CLI command for everything I need to do.

    Time is money; time is not a commodity. Linux wastes too much time.

    It is getting better with each distro, true. But it has been 11 years since Red Hat 5, and Linux is not a whole lot better than it was then.

    What is needed, if Linux really wants to make a stand in the desktop space, is a unified pulling together of all distros. Sit down and truly plan out the desktop. Put together a solid platform that out of the box can really put the hurt on MS or Apple.

    Look what Apple did with OS X! And how many developers are working on it? How many developers are working on Linux across all distros? OS X is a jewel; in 7 years it has matured much further than any *nix distro, and it has a following that cannot yet be challenged by any distro available.

    Why is it that when Win2k came out Linux was claiming to be superior, and yet after 10 years of development it is hardly comparable to XP, let alone Vista/Win 7 or OS X?

    You guys really need to wake up and smell the coffee!

  • Penti - Monday, September 7, 2009

    Of course it's not ready for consumer desktops; there are no serious distributions aimed at that.

    It means no DVD player out of the box, no proprietary codecs, no video editing software, no proprietary drivers that work magically. Of course SLED and RHEL Desktop aren't ready for normal users either; they're targeted at Linux admins who set up the environment. Community distributions won't be as easy for those users to set up, and they will also always lack the above-mentioned stuff; it's simply not legal for them to offer it out of the box. OS X is actually older than Linux and ran on x86 before Apple bought Jobs' company NeXT. It's also supported by an OEM (the OEM being Apple themselves), which no Linux dist is. It also uses many GNU technologies like GCC, X11 (optional but included on disc), the bash shell, and so on, and of course Samba for SMB/CIFS; on the server edition they use a modified OpenLDAP server, Dovecot and Postfix for mail, Apache, PHP, Perl, MySQL, etc. Stuff that's developed on Linux and has matured thanks to it.

    There are a lot of problems with having just community-supported stuff, but that doesn't mean it's useless or sucks. Sure, the kernel developers aren't really helping to get drivers in there, by locking out closed source stuff, but proprietary drivers end up useless if they aren't updated anyway. For servers, just buy RHEL- or SLES-certified hardware and you get all the hardware support needed. On the other hand, you wouldn't be very successful running 7-year-old video drivers on Windows either. Community distros definitely don't need to cease existing for a commercial one to be created. But there will never be one Linux, and that's really the beauty of it, not the curse. It wasn't meant to be something rivaling Windows, and the kernel developers have no desire to create a distro. That's why we see Linux in things like Android and Maemo, and in everything from home routers to mainframes and supercomputers. For a commercial entity, targeting that many devices wouldn't be possible, not with the same basic code and libraries.

    There are definitely some top-notch products and solutions based on Linux and GNU. But Linux doesn't want anything, as it's not an entity. And it's really up to GNOME and KDE to create the desktop environment; it's not the distros that shape them and write all the libraries that software developers use to create their software. As there is no major consumer desktop distro maker, there is also no one who can really steer them by sponsoring work and holding discussions, not towards a unified desktop environment for normal non-tech users anyway. Also, GNOME and KDE have no desire to create an exclusive platform around their software.

    OS X is an innovative 20-year-old OS (counting from its commercial release) and is actually based on work from before then (BSD code). The OS X UI is really 20 years in the making and builds heavily on the NeXT/OpenStep framework. Other Unixes have had no such heritage to build on; X was a total mess on the commercial Unixes, and I would actually say it's a lot better and more streamlined now. There's just Xorg now, and sure, there are a lot of window managers, but only two major environments, so it's still better than when every vendor had its own and couldn't make up its mind on which direction to go and what to standardize on. In the middle of the '90s there were at least 4 major Unix vendors who all had their own workstations.
  • fazer150 - Friday, September 4, 2009

    Which Linux distro have you tried? Did you try PCLinuxOS, which is at least as usable as Windows XP or 2003?
  • nilepez - Sunday, August 30, 2009

    Most end users are not comfortable with the command line. Linux, even Ubuntu, is still not ready for the masses. This shouldn't be confused with the quality of the OS; it's mostly a GUI issue. I've also had some issues with installers failing. Some were solved from an xterm, and others just didn't work.

    It wasn't a big deal in most cases, because there's generally another program that can get the job done, but for the typical home user, it's a deal killer. Nevertheless, I must give credit where credit is due, and Ubuntu has made huge strides in the right direction. The UI isn't close to Windows 7 and I suspect it's not close to OS X either, but Canonical is moving in the right direction.

  • Etern205 - Thursday, August 27, 2009

    See, this is the problem with some Linux users: you guys are always somewhat closed up in a nutshell. What you may think is easy does not mean the rest of the world will agree with you. In this day and age, people want to get things done quickly, in the least amount of time possible. On Mac OS X and Windows, getting a simple task done takes about 3 simple clicks; on Ubuntu, performing the same task requires the user to do an extensive amount of research just to complete it.

    I'm glad this article was written by an author who had not headed into Linux territory before, as it shows the true side of Linux from the perspective of a new user.

    Just because you like to do ramen coding and so forth does not mean others will. If Linux wants to become mainstream, then they really need to stand in the shoes of Joe or Jane.
  • forkd - Saturday, October 31, 2009

    I use Mac, Windows, and Linux, and I must disagree with your assessment of "this is the problem with some Linux users".

    This article, and this site for that matter, comes from the perspective of a Windows (and somewhat Mac) user looking at Linux, more specifically Ubuntu. From this point of view, of course Linux is difficult. A person who is Linux-focused thinks Windows is difficult at first too, and is likely to criticize it. If you take the time to learn something instead of just criticizing it because it is different, you may be a lot happier.
  • fepple - Thursday, August 27, 2009

    Check out all the usability studies the GNOME Project does, then come back and make some more generalizations :)
  • SoCalBoomer - Thursday, August 27, 2009

    Again, those are done by Linux people. His points are right on... someone a while ago did a "Mom" test, which is closer to what is needed, not people who know computers doing studies on usability.
