It’s Secure

Security is a tough nut to crack, both in terms of making something secure and in judging whether something is secure. I’m going to call Ubuntu secure, and I suspect that there’s going to be a lot of disagreement here. Nonetheless, allow me to explain why I consider Ubuntu secure.

Let’s first throw out the idea that any desktop OS can be perfectly secure. The weakest component in any system is the user: if they can install software, they can install malware. So while Ubuntu would be extremely secure if the user could not install any software, it would not be very useful that way. Ubuntu is just as capable of catching malware as any other desktop OS if the user is determined enough to install it. The dancing pigs problem is not solved here.

Nevertheless, Ubuntu is more secure than other OSes (and let’s be frank, we’re talking about Windows) for two kinds of reasons: practical ones and technical ones.

To completely butcher a metaphor here: if your operating system has vulnerabilities and no one is exploiting them, is it really vulnerable? The logical answer is “yes,” and yet that’s not quite how things work in practice. Or more simply put: when was the last time you saw a malware outbreak ravaging the Ubuntu (or any desktop Linux distro) community?

Apple often gets nailed for this logic, and yet I have a hard time disagreeing with it. If no one is trying to break into your computer, then right now, at this moment, it’s secure. The Ubuntu and Mac OS X user bases are so tiny compared to that of Windows that attacking anything but Windows makes very little sense from an attacker’s perspective.

It’s true that both are soft targets (few machines run anti-virus software, and there’s no other malware to fend off), but that does not seem to be driving any significant malware creation for either platform. This is particularly true of Mac OS X, where security researchers have warned about the complacency this fosters; yet other than a few proof-of-concept trojan horses, the only time anyone seems to make a real effort to break into a Mac is to win one in a contest.

So I am going to call Ubuntu, with its even smaller user base and lack of active threats, practically secure. No one is trying to break into Ubuntu machines, and several years of history with the similarly situated Mac OS X suggest that this isn’t going to change. There just aren’t any credible threats to worry about right now.

With that said, there are also plenty of good technical reasons why Ubuntu is secure; beyond being practically secure, it would be difficult to break into the OS even if you wanted to. Probably the most noteworthy aspect is that Ubuntu does not ship with any outward-facing services or daemons, which means there is nothing listening that could be compromised to facilitate a fully automated remote code execution attack. Windows has historically been compromised many times through such attacks, most recently in October of 2008. Firewalls are intended to prevent these kinds of issues, but there is always someone out there who manages to be completely exposed to the internet anyhow, so not having any outward-facing services in the first place is an excellent design decision.
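
You can verify the no-listeners claim yourself. A minimal sketch, assuming a Linux /proc filesystem (the usual tools, `netstat -tln` or `ss -tln`, report the same information more readably):

```shell
# TCP sockets in the LISTEN state appear in /proc/net/tcp with state
# code 0A; column 2 is the local address:port in hex. On a default
# Ubuntu desktop install, nothing here is bound to an external interface.
awk 'NR > 1 && $4 == "0A" { print $2 }' /proc/net/tcp
```

Anything that does show up on a stock install is typically bound to 127.0.0.1, which a remote attacker cannot reach.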

Less encouraging about Ubuntu’s design choices, however, is that, in part because there are no services to expose, the OS does not ship with an enabled firewall. The Linux kernel has built-in firewall functionality through iptables, but out of the box Ubuntu lets everything in and out. This is similar to how Mac OS X ships, and significantly different from Windows Vista, which blocks all incoming connections by default. Worse yet, Ubuntu doesn’t ship with a GUI to control the firewall either (something Mac OS X does have), which necessitates pulling down a third-party package or configuring it via the CLI.

Operating System | Inbound                                                         | Outbound
Windows Vista    | All applications blocked; applications can request an open port | All applications allowed; complex GUI to block them
Ubuntu 8.04      | All applications allowed; no GUI to change this                 | All applications allowed; no GUI to change this
Mac OS X 10.5    | All applications allowed; simple GUI to block them              | All applications allowed; no GUI to change this

Now to be fair, even if Ubuntu had shipped with a GUI tool for configuring its firewall, I likely would have set it up exactly as I leave Mac OS X: all incoming connections allowed. Nevertheless, I find myself scratching my head. Host-based firewalls aren’t the solution to everything that ails computer security, but they are a good idea. I would rather see Ubuntu ship the way Vista does, with an active firewall blocking incoming connections by default.
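
For readers who do want Vista-like behavior, the CLI route is short. A sketch, assuming the ufw (Uncomplicated Firewall) front end to iptables that Ubuntu 8.04 introduced is installed:

```shell
# Set the default policy to drop unsolicited incoming connections,
# then turn the firewall on and confirm its state. Outbound traffic
# is unaffected by the default policy.
sudo ufw default deny
sudo ufw enable
sudo ufw status
```

Individual services can then be whitelisted (for example, `sudo ufw allow 22` for SSH), which is roughly the workflow Vista's "applications can request an open port" model automates.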

Backwards compatibility, or rather the lack thereof, is also a technical security benefit for Ubuntu. Unlike Windows, which attempts to provide security while still supporting old software that pre-dates Windows’ modern security model, Ubuntu has no such legacy software to deal with. Since Linux has supported the traditional *nix security model from the get-go, properly built legacy software should not expect free rein of the system when running, and hence should not become a modern vulnerability. This is more an artifact of previous design than a deliberate feature, but it bears mentioning as a pillar of the overall security picture.

Moving on, there is an interesting element of Ubuntu’s design that makes it more secure, though I hesitate to call it intentional. Earlier I mentioned that an OS that doesn’t let a user install software isn’t very useful, and Ubuntu falls under this umbrella somewhat. Because the OS is built heavily around a package manager and signed packages, it’s not well geared towards installing software from outside the package manager. Depending on how it’s packaged, a downloaded application often needs to be manually marked executable before it can be run, significantly impairing a user’s ability to blindly run anything they click on. It’s genuinely hard to run non-packaged software on Ubuntu, and in this case that’s a security benefit: it’s that much harder to coerce a user into running malware, even if the dancing pigs problem isn’t solved.

Rounding out Ubuntu’s security underpinnings, we have the more traditional mechanisms. No-eXecute bit support helps prevent buffer overflow exploits, and Address Space Layout Randomization (ASLR) makes targeting specific memory addresses harder. The traditional *nix sudo mechanism keeps software running with user privileges unless the user specifically authenticates to take on full root privileges, making it functionally similar to UAC on Vista (or rather, the other way around). Finally, Ubuntu ships with the AppArmor and SELinux security policy frameworks, which enable further locking down the OS, although these are generally overkill for home use.
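
The ASLR piece can be observed from userspace. A quick Linux-specific sketch (a sysctl value of 1 or 2 below means randomization is on):

```shell
# 0 = ASLR off, 1 = stack/mmap randomized, 2 = heap randomized as well.
cat /proc/sys/kernel/randomize_va_space

# With ASLR on, each new process gets its stack mapped at a different
# address, which two back-to-back reads of a fresh process's own
# memory map will usually show.
grep '\[stack\]' /proc/self/maps
grep '\[stack\]' /proc/self/maps
```

Each `grep` is a new process reading its own map, so under ASLR the two stack addresses typically differ between runs.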

There’s one last issue I’d like to touch on when it comes to technical security measures, and that’s the nature of open source software. There is a well-reasoned argument that open source software is more secure because anyone can check the source code for security vulnerabilities and fix them. Conversely, being able to see the source code also means that such vulnerabilities cannot be hidden from attackers.

It’s not a settled debate, nor do I intend to settle it, but it bears mentioning. Comparing the list of updates on a fresh Ubuntu install against the CERT vulnerability list, there are a number of patched vulnerabilities in various programs included with Ubuntu; Firefox, for example, has now been patched for vulnerabilities seven times. There are enough of them that I don’t believe simply counting vulnerabilities is a good way to decide whether being open source significantly improves Ubuntu’s security. This also comes full-circle with the notion of Ubuntu being practically secure (are there more vulnerabilities that people aren’t bothering to look for?), but nevertheless it is my belief that being open source is a security benefit for Ubuntu, even if I can’t completely prove it.

Because of the aforementioned ability to see and modify any and every bit of code in Ubuntu and its applications, Ubuntu gains another security advantage: users can manually patch flaws immediately (assuming they know how), and Ubuntu security updates are pushed out about as rapidly as humanly possible. This is a significant distinction from Windows and Patch Tuesday; while Microsoft has a good business reason for batching patches (IT admins would rather get all their patches at once than constantly test new ones), it’s not good technical reasoning. Ubuntu is more secure than Windows by virtue of patching most vulnerabilities sooner.

Finally, there are certainly areas where Ubuntu’s security could improve. I’ve already touched on the firewall situation, but sandboxing is the other notable weakness. Windows has seen a lot of work put into sandboxing Internet Explorer so that machines cannot be hit with drive-by malware downloads, and it has proven to be effective. Both Internet Explorer and Google’s Chrome implement sandboxes, using different methods with similar results. Meanwhile Chrome is not yet ready for Linux, and Firefox lacks sandboxing abilities. Given the browser’s importance in certain kinds of malware infections, Ubuntu would benefit greatly from having Firefox sandboxed, even if no one is specifically targeting Ubuntu right now.


  • zerobug - Monday, February 1, 2010 - link

    Regarding benchmarks and Linux-focused hardware roundups, one thing worth considering is that while Microsoft pours OS development resources into features that push end users toward the latest and greatest hardware, Linux puts its effort into letting end users keep their old hardware and still get the best experience while running the latest and greatest software.
    So the benchmarks could compare the user experience when running popular software on Microsoft and Linux OSes across machines of varying power.
    For this, you could pick up some popular open source and proprietary (or their free equivalents) application that can run both Linux and W7. and compare the price, time and power consumption for retrieving, saving, processing, compiling, encrypting,decrypting compacting, extracting, encoding, decoding, backup, restore, nº of frames,etc, with machines in a range of different CPU and memory capacities.
  • MarcusAsleep - Thursday, December 17, 2009 - link

    Quick Startup: OK, Windows is fast, at first. Well, let's say that's if you install it yourself, without all the bloatware that comes standard on store-bought Windows PCs (we bought a Toshiba laptop for work with Vista that took 12 minutes after boot-up to respond to a click on the Start menu, even on the third boot).

    Windows startup is often burdened by auto-updaters from Microsoft, anti-virus, Sun Java, Acrobat Reader, etc. that slow down the computer on boot-up, to the point where your original idea of "hey, I just want to start my computer and check my email for a minute before work" can take at least five minutes. I can do this reliably on Linux in one. Yes, if you know a lot about Windows you can stop all the auto-updates and maintain them yourself, but 99% of Windows users don't have the time or know-how to do this.

    Trouble-free: For example, I installed Linux (Mepis Linux) on a computer for my wife's parents 7 years ago for email, pictures, games, web, and letter writing, and haven't had to touch it since. This is typical.

    For Windows, I have often done fresh installs on trojan/virus-infected computers, installing working antivirus and all Windows updates (a process that takes about 2-4 hours of updates upon updates, plus downloads of the proper drivers from the manufacturers' websites, versus about 1 hour for an Ubuntu install with all updates done, including any extra work for codecs and graphics drivers), only to come back a couple of months later to a computer made slow again by users installing adware, infected games, etc.

    Free: Nearly every Windows reinstall I've had to do starts with a computer loaded with MS Office, games, etc., but come reinstall time nobody has the disks for these. There is a lot of "sharing" of computer programs in the Windows world that is not very honest.

    With Linux, you can have the operating system plus pretty much anything else you would need, without having to cheat.

    Adaptable Performance: You can get a well-performing Linux installation (LXDE desktop) on a PIII computer with 256MB of RAM. The only thing that will seem slow to an average mom-and-pop user is surfing flash-heavy web pages, but with Adblock on Firefox it's not too bad. With Vista loaded on this computer, it would be annoyingly slow. You can often continue to use or re-use computer hardware with Linux for years after it becomes unsuitable for Windows.

    I think these features are of high value to the average user (maybe not the average AnandTech reader, but the average surf/email/do homework/look at photos/play solitaire/balance my checkbook user).

    Cheers!

    Mark.
  • SwedishPenguin - Wednesday, October 28, 2009 - link

    Using SMB for network performance benchmarking is extremely biased. It's a proprietary Microsoft protocol; of course Microsoft is going to win that one. Use NFS, HTTP, FTP, SSH, or some other open protocol for network performance benchmarking. A lot of NASes do support these, as they are Linux-based.

    Furthermore, using a Windows server with SMB on the argument that most consumer NASes use SMB is pretty ridiculous: those NASes are most likely running Samba, not native SMB (the same Samba implemented in GNU/Linux distributions and Mac OS X). Not to mention that most of the NASes I've seen offer at least one of these open protocols as an alternative.
  • SwedishPenguin - Wednesday, October 28, 2009 - link

    The ISO thing is pretty ridiculous; creating a simple GUI in both GTK and Qt and integrating them into GNOME and KDE should be pretty easy, though I suppose integration with the respective virtual file systems would be in order, in which case it might get slightly more complex for those (like me) not familiar with the code. There's even a FUSE (userspace filesystem) module now, so you wouldn't even need to be root to mount an ISO.

    As for the command-line support, IMO that's a good thing. It's a lot easier, both for the person helping and for the person needing help, to write or copy-paste a few commands than to say "click that button, then that one, then another one," and so on. It's also a lot easier for the person needing help to simply paste the result if it didn't work, which makes diagnosing the problem much easier than having the user try to describe the output. And you usually get much more useful information from command-line utilities than from GUIs; the GUI simplifies the information so anyone can understand it, but at the price of making debugging a lot more difficult.
  • nillbug - Wednesday, September 30, 2009 - link

    It must be said that Ubuntu and the major Linux distributions have all had 64-bit OS versions for a long time now. The reason is to let users benefit from more than 4GB of memory and from 64-bit CPUs (almost all of them today), gaining a better computing experience.

    If this article were the author's private exercise in deciding whether he should move to Linux, people should simply advise him of the above. But for an article intended to be read by thousands, it must be pointed out that its conclusion is misleading.

    Given today's reality (and not just the author's), why did he never mention the 64-bit Ubuntu builds? I suspect his final thoughts would then have been much more in favor of Linux.
  • seanlee - Tuesday, September 15, 2009 - link

    I have read all 17 pages of comments… a lot of Linux lovers out there… and they all purposely ignore a few important things that make Windows successful, which in turn make most Linux distributions marketing failures. I have used Linux on my netbook and my PS3, and I absolutely hate it.
    1. User friendliness. No, the CLI is not user friendly, no matter what you say, no matter what excuse you use, no matter how blind you are. NOT ONE COMPANY dares to make its mainstream product CLI-only, from things as simple as ATMs and iPods to things as complicated as cellphones, cars, and airplanes. That ought to tell you something: the CLI is not intuitive, and not intuitive means it sucks. You command-line fanboys are more than welcome to program punched cards, except no one uses punched cards and machine language anymore, because they are counter-intuitive. Having to use the CLI is a pain for the average user, and having to use it every time you install a new program or driver is a nightmare. A GUI is a big selling point, and a seamless human-computer experience is what every software company is looking to achieve.
    2. There is NOTHING Linux can do that Windows cannot. On the contrary, there are a lot of things Windows can do that Linux cannot. I'd like to challenge any Linux user to find alternatives to engineering software on Linux, like Matlab, Simulink, Xilinx, OrCAD, LabVIEW, CAD… you cannot. For people who actually use their computer for productive means (not saying typing documents isn't productive, but you can type a document on a typewriter with no CPU required whatsoever), there is nothing, again, I repeat, NOTHING that Linux can offer me.
    3. Security. I disagree about the security issues Windows supposedly has. I could set up a Vista machine, turn it on, lock it in a cage, and it would be as secure as any Linux machine out there. Hell, if I bought a rock, pretended it was a computer, and stared at it all day, it would be the most secure system known to mankind. Linux's security is largely due to one of two things: 1. it's not popular, so there's not enough software to support it and play with; 2. it's not popular because it's not user-friendly. Neither is a good title to have. It's like being spared a tax increase not because you structured your business to write off all your expenses, but because you don't make any money and thus pay no tax.
    4. There is nothing revolutionary about Linux for the average user, other than that it's free. If free is your biggest selling point, you are in serious trouble. Most people, if not all, would pay for a quality product over a free one unless the free one is just as good. Obviously Ubuntu is never going to be as good as Windows, because they don't have the money MS has. So what does Ubuntu have that really makes me want to switch and take a few weeks of classes to learn those commands?

    Be honest, people. If you could only have ONE OS to use, most of you would choose Windows.
  • kensolar - Monday, October 26, 2009 - link


    I hope you realize that your hatred is showing so strongly that absolutely no one cares what you say.
    That said, I don't know how to use a CLI and have been successfully using Linux for 3 years. I found the article to be a fair one, even though the author is unfamiliar with Linux/Ubuntu. Just as he doesn't use the default apps in Windows, Linux users don't use only the defaults in Linux: K3b is far superior to Brasero, and so on. In addition, I don't think he conveyed the extent of the software available in the repositories (with additional repositories easy to add): hundreds of apps, some 20,000 packages, even security tools, ranging from easy as pie to complicated (for those of us whose computer is more than a rock). I personally do audio mixing, video transcoding, and advanced printing, all with graphical interfaces.
    BTW, I turned on a computer for the first time 3 1/2 years ago; I stopped using Windows a little over 3 years ago and have no reason to go back. I find it too hard, limiting, and frustrating to use. Plus, I can't live without multiple desktops; the author hasn't gotten it yet, but once you get used to them you can't go back.
    Well, I've said enough for now. Can't wait for your next article.
